[ 555.898074] env[67081]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 556.526030] env[67131]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 558.073773] env[67131]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=67131) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 558.074154] env[67131]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=67131) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 558.074284] env[67131]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=67131) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 558.074529] env[67131]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 558.075637] env[67131]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 558.194053] env[67131]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=67131) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 558.205428] env[67131]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=67131) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 558.307747] env[67131]: INFO nova.virt.driver [None req-3e0b8b56-a6a9-4864-901f-201c9cec77f1 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 558.380323] env[67131]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 558.380476] env[67131]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 558.380591] env[67131]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=67131) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 561.543071] env[67131]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-c34caae6-f2ec-41ed-87a7-03e6c58a46dd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.558370] env[67131]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=67131) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 561.558513] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-d893fbc5-8fcc-492d-b4b8-988429e92c02 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.592724] env[67131]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 3fd52.
[ 561.592880] env[67131]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.212s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 561.593505] env[67131]: INFO nova.virt.vmwareapi.driver [None req-3e0b8b56-a6a9-4864-901f-201c9cec77f1 None None] VMware vCenter version: 7.0.3
[ 561.596922] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38382350-44b6-492e-808b-c1f619dc2d9a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.614209] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-311b497e-55f8-4ef0-a4eb-082c297923d3 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.620628] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c8a3b0e-1e86-4b39-afe8-66d3750f78e2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.627414] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb25b12e-7354-4e90-885f-b5185d55d6c3 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.641629] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-236e8147-19fa-4fdd-9231-ae8c7b1ed9d1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.647536] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-803aadc0-b302-41d1-8742-3963c90499a4 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.677078] env[67131]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-a2357bdc-1ce6-4031-9c4d-cb0affd28c44 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.682216] env[67131]: DEBUG nova.virt.vmwareapi.driver [None req-3e0b8b56-a6a9-4864-901f-201c9cec77f1 None None] Extension org.openstack.compute already exists. {{(pid=67131) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 561.684997] env[67131]: INFO nova.compute.provider_config [None req-3e0b8b56-a6a9-4864-901f-201c9cec77f1 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 561.703054] env[67131]: DEBUG nova.context [None req-3e0b8b56-a6a9-4864-901f-201c9cec77f1 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),3dbc671c-39ac-4995-84a5-32cfffb8561f(cell1) {{(pid=67131) load_cells /opt/stack/nova/nova/context.py:464}}
[ 561.704975] env[67131]: DEBUG oslo_concurrency.lockutils [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 561.705233] env[67131]: DEBUG oslo_concurrency.lockutils [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 561.705947] env[67131]: DEBUG oslo_concurrency.lockutils [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 561.706307] env[67131]: DEBUG oslo_concurrency.lockutils [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Acquiring lock "3dbc671c-39ac-4995-84a5-32cfffb8561f" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 561.706514] env[67131]: DEBUG oslo_concurrency.lockutils [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Lock "3dbc671c-39ac-4995-84a5-32cfffb8561f" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 561.707526] env[67131]: DEBUG oslo_concurrency.lockutils [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Lock "3dbc671c-39ac-4995-84a5-32cfffb8561f" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 561.720394] env[67131]: DEBUG oslo_db.sqlalchemy.engines [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67131) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 561.720794] env[67131]: DEBUG oslo_db.sqlalchemy.engines [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67131) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 561.728145] env[67131]: ERROR nova.db.main.api [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 561.728145] env[67131]: result = function(*args, **kwargs)
[ 561.728145] env[67131]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 561.728145] env[67131]: return func(*args, **kwargs)
[ 561.728145] env[67131]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 561.728145] env[67131]: result = fn(*args, **kwargs)
[ 561.728145] env[67131]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 561.728145] env[67131]: return f(*args, **kwargs)
[ 561.728145] env[67131]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 561.728145] env[67131]: return db.service_get_minimum_version(context, binaries)
[ 561.728145] env[67131]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 561.728145] env[67131]: _check_db_access()
[ 561.728145] env[67131]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 561.728145] env[67131]: stacktrace = ''.join(traceback.format_stack())
[ 561.728145] env[67131]:
[ 561.728822] env[67131]: ERROR nova.db.main.api [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 561.728822] env[67131]: result = function(*args, **kwargs)
[ 561.728822] env[67131]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 561.728822] env[67131]: return func(*args, **kwargs)
[ 561.728822] env[67131]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 561.728822] env[67131]: result = fn(*args, **kwargs)
[ 561.728822] env[67131]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 561.728822] env[67131]: return f(*args, **kwargs)
[ 561.728822] env[67131]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 561.728822] env[67131]: return db.service_get_minimum_version(context, binaries)
[ 561.728822] env[67131]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 561.728822] env[67131]: _check_db_access()
[ 561.728822] env[67131]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 561.728822] env[67131]: stacktrace = ''.join(traceback.format_stack())
[ 561.728822] env[67131]:
[ 561.729216] env[67131]: WARNING nova.objects.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 561.729332] env[67131]: WARNING nova.objects.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Failed to get minimum service version for cell 3dbc671c-39ac-4995-84a5-32cfffb8561f
[ 561.729784] env[67131]: DEBUG oslo_concurrency.lockutils [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Acquiring lock "singleton_lock" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 561.729911] env[67131]: DEBUG oslo_concurrency.lockutils [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Acquired lock "singleton_lock" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 561.730167] env[67131]: DEBUG oslo_concurrency.lockutils [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Releasing lock "singleton_lock" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 561.730497] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Full set of CONF: {{(pid=67131) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 561.730639] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ******************************************************************************** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 561.730765] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] Configuration options gathered from: {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 561.730897] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 561.731101] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 561.731237] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ================================================================================ {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 561.731439] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] allow_resize_to_same_host = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.731607] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] arq_binding_timeout = 300 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.731731] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] backdoor_port = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.731856] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] backdoor_socket = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.732029] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] block_device_allocate_retries = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.732195] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] block_device_allocate_retries_interval = 3 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.732361] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cert = self.pem {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.732524] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.732687] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute_monitors = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.732846] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] config_dir = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.733017] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] config_drive_format = iso9660 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.733153] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.733316] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] config_source = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.733479] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] console_host = devstack {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.733641] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] control_exchange = nova {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.733796] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cpu_allocation_ratio = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.733951] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] daemon = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.734129] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] debug = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.734284] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] default_access_ip_network_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.734445] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] default_availability_zone = nova {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.734594] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] default_ephemeral_format = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.734826] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.734983] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] default_schedule_zone = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.735144] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] disk_allocation_ratio = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.735303] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] enable_new_services = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.735480] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] enabled_apis = ['osapi_compute'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.735635] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] enabled_ssl_apis = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.735787] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] flat_injected = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.735941] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] force_config_drive = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.736112] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] force_raw_images = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.736279] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] graceful_shutdown_timeout = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.736433] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] heal_instance_info_cache_interval = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.736638] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] host = cpu-1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.736800] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] initial_cpu_allocation_ratio = 4.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.736973] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] initial_disk_allocation_ratio = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.737159] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] initial_ram_allocation_ratio = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.737375] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.737533] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] instance_build_timeout = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.737687] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] instance_delete_interval = 300 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.737849] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] instance_format = [instance: %(uuid)s] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.738014] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] instance_name_template = instance-%08x {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.738177] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] instance_usage_audit = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.738345] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] instance_usage_audit_period = month {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.738509] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.738674] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] instances_path = /opt/stack/data/nova/instances {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.738834] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] internal_service_availability_zone = internal {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.739015] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] key = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.739186] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] live_migration_retry_count = 30 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.739349] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] log_config_append = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.739513] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.739667] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] log_dir = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.739819] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] log_file = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.739944] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] log_options = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.740116] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] log_rotate_interval = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.740281] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] log_rotate_interval_type = days {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.740443] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] log_rotation_type = none {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.740566] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.740686] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.740848] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.741020] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.741146] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.741304] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] long_rpc_timeout = 1800 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.741456] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] max_concurrent_builds = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.741606] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] max_concurrent_live_migrations = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.741755] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] max_concurrent_snapshots = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.741908] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] max_local_block_devices = 3 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.742069] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] max_logfile_count = 30 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.742223] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] max_logfile_size_mb = 200 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.742373] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] maximum_instance_delete_attempts = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.742535] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] metadata_listen = 0.0.0.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.742696] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] metadata_listen_port = 8775 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.742856] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] metadata_workers = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.743023] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] migrate_max_retries = -1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.743191] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] mkisofs_cmd = genisoimage {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.743392] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] my_block_storage_ip = 10.180.1.21 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.743520] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] my_ip = 10.180.1.21 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.743674] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] network_allocate_retries = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.743847] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] non_inheritable_image_properties
= ['cache_in_nova', 'bittorrent'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.744015] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] osapi_compute_listen = 0.0.0.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.744193] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] osapi_compute_listen_port = 8774 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.744417] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] osapi_compute_unique_server_name_scope = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.744592] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] osapi_compute_workers = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.744751] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] password_length = 12 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.744907] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] periodic_enable = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.745078] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] periodic_fuzzy_delay = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.745245] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] pointer_model = usbtablet {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.745408] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] preallocate_images = none {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.745563] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] publish_errors = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.745688] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] pybasedir = /opt/stack/nova {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.745837] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ram_allocation_ratio = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.745994] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] rate_limit_burst = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.746174] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] rate_limit_except_level = CRITICAL {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.746330] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] rate_limit_interval = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.746483] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] reboot_timeout = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.746635] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] reclaim_instance_interval = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.746789] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] record = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.747026] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] reimage_timeout_per_gb = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.747230] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] report_interval = 120 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.747392] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] rescue_timeout = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.747547] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] reserved_host_cpus = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.747703] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] reserved_host_disk_mb = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.747859] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] reserved_host_memory_mb = 512 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.748020] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] reserved_huge_pages = None {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.748173] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] resize_confirm_window = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.748329] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] resize_fs_using_block_device = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.748481] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] resume_guests_state_on_host_boot = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.748644] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.748797] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] rpc_response_timeout = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.749022] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] run_external_periodic_tasks = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.749198] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] running_deleted_instance_action = reap {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.749356] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] running_deleted_instance_poll_interval = 1800 {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.749506] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] running_deleted_instance_timeout = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.749658] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler_instance_sync_interval = 120 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.749787] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_down_time = 300 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.749952] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] servicegroup_driver = db {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.750119] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] shelved_offload_time = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.750273] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] shelved_poll_interval = 3600 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.750432] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] shutdown_timeout = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.750585] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] source_is_ipv6 = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.750740] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ssl_only = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.750985] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.751168] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] sync_power_state_interval = 600 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.751325] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] sync_power_state_pool_size = 1000 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.751562] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] syslog_log_facility = LOG_USER {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.751750] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] tempdir = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.751914] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] timeout_nbd = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.752098] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] transport_url = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.752257] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] update_resources_interval = 0 {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.752411] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] use_cow_images = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.752563] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] use_eventlog = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.752719] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] use_journal = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.752870] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] use_json = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.753035] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] use_rootwrap_daemon = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.753198] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] use_stderr = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.753350] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] use_syslog = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.753499] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vcpu_pin_set = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.753661] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
vif_plugging_is_fatal = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.753821] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plugging_timeout = 300 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.753981] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] virt_mkfs = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.754152] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] volume_usage_poll_interval = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.754309] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] watch_log_file = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.754470] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] web = /usr/share/spice-html5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.754651] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_concurrency.disable_process_locking = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.754950] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.755144] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.755310] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.755477] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_metrics.metrics_process_name = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.755645] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.755804] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.755982] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.auth_strategy = keystone {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.756163] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.compute_link_prefix = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.756337] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.756506] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.dhcp_domain = novalocal {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.756671] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.enable_instance_password = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.756846] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.glance_link_prefix = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.757055] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.757236] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.instance_list_cells_batch_strategy = distributed {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.757392] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.instance_list_per_project_cells = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.757545] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.list_records_by_skipping_down_cells = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.757705] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.local_metadata_per_cell = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.757869] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.max_limit = 1000 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.758043] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.metadata_cache_expiration = 15 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.758223] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.neutron_default_tenant_id = default {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.758445] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.use_forwarded_for = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.758656] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.use_neutron_default_nets = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.758836] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.759037] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.vendordata_dynamic_failure_fatal = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.759219] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.759395] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
api.vendordata_dynamic_ssl_certfile = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.759564] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.vendordata_dynamic_targets = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.759725] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.vendordata_jsonfile_path = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.759923] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api.vendordata_providers = ['StaticJSON'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.760150] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.backend = dogpile.cache.memcached {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.760319] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.backend_argument = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.760486] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.config_prefix = cache.oslo {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.760650] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.dead_timeout = 60.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.760805] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.debug_cache_backend = False {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.761021] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.enable_retry_client = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.761171] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.enable_socket_keepalive = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.761342] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.enabled = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.761502] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.expiration_time = 600 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.761661] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.hashclient_retry_attempts = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.761822] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.hashclient_retry_delay = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.761980] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.memcache_dead_retry = 300 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.763239] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.memcache_password = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.763239] env[67131]: DEBUG 
oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.763239] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.763239] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.memcache_pool_maxsize = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.763239] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.memcache_pool_unused_timeout = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.763239] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.memcache_sasl_enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.763239] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.memcache_servers = ['localhost:11211'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.763591] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.memcache_socket_timeout = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.763591] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.memcache_username = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.763591] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.proxies = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.763718] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.retry_attempts = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.763856] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.retry_delay = 0.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.764044] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.socket_keepalive_count = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.764267] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.socket_keepalive_idle = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.764442] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.socket_keepalive_interval = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.764603] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.tls_allowed_ciphers = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.764760] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.tls_cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.764911] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.tls_certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.765084] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.tls_enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.765244] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cache.tls_keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.765415] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.auth_section = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.765588] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.auth_type = password {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.765745] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.765925] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.catalog_info = volumev3::publicURL {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.766110] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.766275] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.766434] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.cross_az_attach = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.766591] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.debug = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.766750] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.endpoint_template = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.766937] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.http_retries = 3 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.767117] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.767276] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.767451] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.os_region_name = RegionOne {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.767612] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.767770] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cinder.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.767938] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.768110] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.cpu_dedicated_set = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.768269] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.cpu_shared_set = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.768432] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.image_type_exclude_list = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.768597] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.live_migration_wait_for_vif_plug = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.768756] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.max_concurrent_disk_ops = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.768932] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.max_disk_devices_to_attach = -1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.769122] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.769297] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.769460] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.resource_provider_association_refresh = 300 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.769620] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.shutdown_retry_interval = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.769798] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.769979] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] conductor.workers = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.770164] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] console.allowed_origins = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.770320] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] console.ssl_ciphers = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.770486] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] console.ssl_minimum_version = default {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.770652] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] consoleauth.token_ttl = 600 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.770824] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.770978] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.771152] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.771308] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.connect_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.771461] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.connect_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.771612] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.endpoint_override = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.771770] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.771923] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.772090] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.max_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.772249] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.min_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.772401] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.region_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.772554] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.service_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.772718] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.service_type = accelerator {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.772878] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.773068] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.status_code_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.773230] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.status_code_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.773383] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.773559] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.773713] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] cyborg.version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.773892] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.backend = sqlalchemy {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.774079] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.connection = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.774248] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.connection_debug = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.774418] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.connection_parameters = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.774579] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.connection_recycle_time = 3600 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.774742] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.connection_trace = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.774901] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.db_inc_retry_interval = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.775071] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.db_max_retries = 20 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.775234] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.db_max_retry_interval = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.775395] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.db_retry_interval = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.775561] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.max_overflow = 50 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.775720] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.max_pool_size = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.775884] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.max_retries = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.776051] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.mysql_enable_ndb = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.776226] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.mysql_sql_mode = TRADITIONAL {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.776382] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.mysql_wsrep_sync_wait = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.776537] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.pool_timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.776711] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.retry_interval = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.776883] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.slave_connection = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.777066] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.sqlite_synchronous = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.777234] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] database.use_db_reconnect = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.777415] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.backend = sqlalchemy {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.777588] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.connection = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.777752] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.connection_debug = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.777919] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.connection_parameters = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.778091] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.connection_recycle_time = 3600 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.778256] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.connection_trace = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.778415] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.db_inc_retry_interval = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.778573] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.db_max_retries = 20 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.778732] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.db_max_retry_interval = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.778893] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.db_retry_interval = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.779091] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.max_overflow = 50 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.779260] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.max_pool_size = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.779427] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.max_retries = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.779588] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.mysql_enable_ndb = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.779757] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.779932] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.mysql_wsrep_sync_wait = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.780118] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.pool_timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.780290] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.retry_interval = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.780448] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.slave_connection = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.780613] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] api_database.sqlite_synchronous = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.780784] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] devices.enabled_mdev_types = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.780959] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.781134] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ephemeral_storage_encryption.enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.781294] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ephemeral_storage_encryption.key_size = 512 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.781461] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.api_servers = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.783104] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.783300] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.783474] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.783668] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.connect_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.783797] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.connect_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.783961] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.debug = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.784144] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.default_trusted_certificate_ids = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.784311] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.enable_certificate_validation = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.784475] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.enable_rbd_download = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.784636] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.endpoint_override = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.784800] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.784986] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.785180] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.max_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.785342] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.min_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.785503] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.num_retries = 3 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.785673] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.rbd_ceph_conf = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.785835] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.rbd_connect_timeout = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.786017] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.rbd_pool = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.786194] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.rbd_user = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.786353] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.region_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.786510] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.service_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.786677] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.service_type = image {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.786840] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.786998] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.status_code_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.787171] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.status_code_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.787328] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.787509] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.787671] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.verify_glance_signatures = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.787828] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] glance.version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.787995] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] guestfs.debug = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.788179] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.config_drive_cdrom = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.788339] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.config_drive_inject_password = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.788500] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.788660] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.enable_instance_metrics_collection = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.788820] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.enable_remotefx = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.789018] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.instances_path_share = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.789200] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.iscsi_initiator_list = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.789361] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.limit_cpu_features = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.789520] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.789678] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.789841] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.power_state_check_timeframe = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.790033] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.power_state_event_polling_interval = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.790214] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.790375] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.use_multipath_io = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.790534] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.volume_attach_retry_count = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.790689] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.volume_attach_retry_interval = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.790843] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.vswitch_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.791006] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.791184] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] mks.enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.791529] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.791715] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] image_cache.manager_interval = 2400 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.791883] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] image_cache.precache_concurrency = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.792062] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] image_cache.remove_unused_base_images = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.792235] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.792399] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.792571] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] image_cache.subdirectory_name = _base {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.792740] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.api_max_retries = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.792899] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.api_retry_interval = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.793101] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.auth_section = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.793232] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.auth_type = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.793386] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.793539] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.793709] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.793849] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.connect_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.794019] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.connect_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.794179] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.endpoint_override = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.794339] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.insecure = False
{{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.794493] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.794647] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.max_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.794801] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.min_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.794958] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.partition_key = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.795137] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.peer_list = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.795292] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.region_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.795450] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.serial_console_state_timeout = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.795604] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.service_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.795769] env[67131]: DEBUG 
oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.service_type = baremetal {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.795927] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.796098] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.status_code_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.796253] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.status_code_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.796406] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.796581] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.796735] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] ironic.version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.796940] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.797134] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] key_manager.fixed_key = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.797315] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.797473] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.barbican_api_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.797628] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.barbican_endpoint = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.797811] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.barbican_endpoint_type = public {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.797945] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.barbican_region_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.798118] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.798309] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.798495] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
barbican.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.798660] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.798817] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.799050] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.number_of_retries = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.800552] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.retry_delay = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.800552] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.send_service_user_token = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.800552] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.800552] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.800552] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.verify_ssl = True {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.800552] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican.verify_ssl_path = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.800552] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican_service_user.auth_section = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.800881] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican_service_user.auth_type = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.800881] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican_service_user.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.800967] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican_service_user.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.801162] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican_service_user.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.801357] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican_service_user.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.801523] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican_service_user.keyfile = None {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.801684] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican_service_user.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.801838] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] barbican_service_user.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.802011] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.approle_role_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.802176] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.approle_secret_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.802333] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.802484] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.802641] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.802796] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.802949] env[67131]: DEBUG 
oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.803135] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.kv_mountpoint = secret {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.803299] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.kv_version = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.803456] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.namespace = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.803611] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.root_token_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.803816] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.803920] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.ssl_ca_crt_file = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.804085] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.804247] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.use_ssl = False {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.804416] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.804581] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.804736] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.804896] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.805067] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.connect_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.805224] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.connect_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.805380] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.endpoint_override = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.805539] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.805696] env[67131]: DEBUG 
oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.805851] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.max_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.806021] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.min_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.806184] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.region_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.806339] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.service_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.806506] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.service_type = identity {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.806665] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.806843] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.status_code_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.807034] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.status_code_retry_delay = 
None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.807202] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.807384] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.807542] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] keystone.version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.807787] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.connection_uri = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.807904] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.cpu_mode = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.808081] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.cpu_model_extra_flags = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.808251] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.cpu_models = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.808418] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.cpu_power_governor_high = performance {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.808584] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.cpu_power_governor_low = powersave {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.808744] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.cpu_power_management = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.808936] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.809121] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.device_detach_attempts = 8 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.809286] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.device_detach_timeout = 20 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.809450] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.disk_cachemodes = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.809607] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.disk_prefix = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.809770] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.enabled_perf_events = [] {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.809991] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.file_backed_memory = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.810190] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.gid_maps = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.810355] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.hw_disk_discard = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.810515] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.hw_machine_type = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.810687] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.images_rbd_ceph_conf = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.810852] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.811026] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.811467] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.images_rbd_glance_store_name = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
561.811467] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.images_rbd_pool = rbd {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.811528] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.images_type = default {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.811675] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.images_volume_group = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.811834] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.inject_key = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.811993] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.inject_partition = -2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.812168] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.inject_password = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.812328] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.iscsi_iface = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.812487] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.iser_use_multipath = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.812646] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None 
None] libvirt.live_migration_bandwidth = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.812805] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_completion_timeout = 800 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.812964] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_downtime = 500 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.813138] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_downtime_delay = 75 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.813297] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_downtime_steps = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.813452] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_inbound_addr = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.813610] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_permit_auto_converge = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.813766] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_permit_post_copy = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.813923] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_scheme = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.814109] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_timeout_action = abort {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.814271] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_tunnelled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.814444] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_uri = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.814624] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.live_migration_with_native_tls = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.814784] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.max_queues = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.814946] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.mem_stats_period_seconds = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.815119] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.nfs_mount_options = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.815419] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.815590] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.num_aoe_discover_tries = 3 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.815752] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.num_iser_scan_tries = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.815911] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.num_memory_encrypted_guests = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.816085] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.num_nvme_discover_tries = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.816249] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.num_pcie_ports = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.816412] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.num_volume_scan_tries = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.816575] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.pmem_namespaces = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.816732] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.quobyte_client_cfg = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.817096] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.817283] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.rbd_connect_timeout = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.817452] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.817616] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.817773] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.rbd_secret_uuid = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.817930] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.rbd_user = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.818103] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.realtime_scheduler_priority = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.818276] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.remote_filesystem_transport = ssh {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.818433] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.rescue_image_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.818586] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.rescue_kernel_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.818741] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.rescue_ramdisk_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.818925] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.rng_dev_path = /dev/urandom {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.819117] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.rx_queue_size = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.819294] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.smbfs_mount_options = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.819566] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.819734] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.snapshot_compression = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.819894] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.snapshot_image_format = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.820147] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.820321] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.sparse_logical_volumes = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.820482] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.swtpm_enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.820651] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.swtpm_group = tss {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.820817] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.swtpm_user = tss {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.820983] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.sysinfo_serial = unique {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.821155] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.tx_queue_size = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.821344] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.uid_maps = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.821515] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.use_virtio_for_bridges = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.821687] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.virt_type = kvm {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.821854] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.volume_clear = zero {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.822021] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.volume_clear_size = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.822190] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.volume_use_multipath = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.822349] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.vzstorage_cache_path = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.822519] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.822687] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.vzstorage_mount_group = qemu {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.822852] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.vzstorage_mount_opts = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.823029] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.823312] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.823491] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.vzstorage_mount_user = stack {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.823656] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.823828] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.auth_section = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.824036] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.auth_type = password {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.824171] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.824331] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.824494] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.824651] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.connect_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.824807] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.connect_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.824976] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.default_floating_pool = public {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.825149] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.endpoint_override = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.825310] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.extension_sync_interval = 600 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.825469] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.http_retries = 3 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.825628] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.825782] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.825939] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.max_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.826122] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.metadata_proxy_shared_secret = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.826282] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.min_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.826448] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.ovs_bridge = br-int {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.826635] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.physnets = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.826819] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.region_name = RegionOne {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.826992] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.service_metadata_proxy = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.827168] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.service_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.827338] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.service_type = network {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.827499] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.827653] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.status_code_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.827806] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.status_code_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.827970] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.828146] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.828305] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] neutron.version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.828474] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] notifications.bdms_in_notifications = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.828649] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] notifications.default_level = INFO {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.828821] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] notifications.notification_format = unversioned {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.829024] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] notifications.notify_on_state_change = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.829212] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.829390] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] pci.alias = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.829559] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] pci.device_spec = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.829722] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] pci.report_in_placement = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.829899] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.auth_section = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.830119] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.auth_type = password {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.830299] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.auth_url = http://10.180.1.21/identity {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.830460] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.830616] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.830774] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.830931] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.connect_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.831101] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.connect_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.831291] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.default_domain_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.831462] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.default_domain_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.831620] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.domain_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.831776] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.domain_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.831932] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.endpoint_override = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.832107] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.832266] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.832421] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.max_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.832575] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.min_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.832739] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.password = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.832896] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.project_domain_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.833071] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.project_domain_name = Default {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.833237] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.project_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.833407] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.project_name = service {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.833575] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.region_name = RegionOne {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.833731] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.service_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.833895] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.service_type = placement {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.834065] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.834225] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.status_code_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.834380] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.status_code_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.834535] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.system_scope = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.834685] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.834839] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.trust_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.834993] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.user_domain_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.835173] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.user_domain_name = Default {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.835331] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.user_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.835501] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.username = placement {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.835677] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.835832] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] placement.version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.836008] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.cores = 20 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.836182] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.count_usage_from_placement = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.836354] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.836524] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.injected_file_content_bytes = 10240 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.836685] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.injected_file_path_length = 255 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.836845] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.injected_files = 5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.837015] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.instances = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.837185] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.key_pairs = 100 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.837347] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.metadata_items = 128 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.837538] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.ram = 51200 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.837673] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.recheck_quota = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.837835] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.server_group_members = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.837997] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] quota.server_groups = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.838181] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] rdp.enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.838523] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.838742] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.838931] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.839123] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.image_metadata_prefilter = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.839293] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.839459] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.max_attempts = 3 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.839622] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.max_placement_results = 1000 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.839784] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.839964] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.query_placement_for_availability_zone = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.840153] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.query_placement_for_image_type_support = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.840316] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.840489] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] scheduler.workers = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.840660] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.840830] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.841019] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.841200] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.841388] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.841556] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.841718] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.841913] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.842099] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c
None None] filter_scheduler.host_subset_size = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.842262] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.image_properties_default_architecture = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.842424] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.842589] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.isolated_hosts = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.842753] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.isolated_images = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.842916] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.max_instances_per_host = 50 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.843089] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.843251] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.pci_in_placement = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.843411] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c 
None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.843568] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.843725] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.843882] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.844062] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.844238] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.844393] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.track_instance_changes = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.844565] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
561.844731] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] metrics.required = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.844893] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] metrics.weight_multiplier = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.845064] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] metrics.weight_of_unavailable = -10000.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.845231] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] metrics.weight_setting = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.845520] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.845691] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] serial_console.enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.845865] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] serial_console.port_range = 10000:20000 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.846043] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.846219] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.846385] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] serial_console.serialproxy_port = 6083 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.846550] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_user.auth_section = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.846720] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_user.auth_type = password {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.846876] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_user.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.847038] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_user.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.847202] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_user.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.847358] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_user.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.847514] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_user.keyfile = None 
{{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.847682] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_user.send_service_user_token = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.847842] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_user.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.847999] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] service_user.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.848181] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.agent_enabled = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.848355] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.848652] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.848843] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.html5proxy_host = 0.0.0.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.849055] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.html5proxy_port = 6082 {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.849231] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.image_compression = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.849395] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.jpeg_compression = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.849556] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.playback_compression = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.849726] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.server_listen = 127.0.0.1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.849906] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.850107] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.streaming_mode = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.850280] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] spice.zlib_compression = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.850448] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] upgrade_levels.baseapi = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.850608] env[67131]: 
DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] upgrade_levels.cert = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.850782] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] upgrade_levels.compute = auto {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.850941] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] upgrade_levels.conductor = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.851122] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] upgrade_levels.scheduler = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.851311] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vendordata_dynamic_auth.auth_section = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.851484] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vendordata_dynamic_auth.auth_type = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.851644] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vendordata_dynamic_auth.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.851799] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vendordata_dynamic_auth.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.851961] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vendordata_dynamic_auth.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.852166] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vendordata_dynamic_auth.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.852338] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vendordata_dynamic_auth.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.852503] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vendordata_dynamic_auth.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.852661] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vendordata_dynamic_auth.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.852834] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.api_retry_count = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.852992] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.ca_file = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.853183] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.cache_prefix = devstack-image-cache {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.853350] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None 
None] vmware.cluster_name = testcl1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.853512] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.connection_pool_size = 10 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.853670] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.console_delay_seconds = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.853837] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.datastore_regex = ^datastore.* {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.854050] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.854229] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.host_password = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.854433] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.host_port = 443 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.854562] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.host_username = administrator@vsphere.local {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.854729] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.insecure = True {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.854889] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.integration_bridge = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.855060] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.maximum_objects = 100 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.855225] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.pbm_default_policy = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.855385] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.pbm_enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.855541] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.pbm_wsdl_location = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.855707] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.855863] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.serial_port_proxy_uri = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.856026] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.serial_port_service_uri = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.856196] 
env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.task_poll_interval = 0.5 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.856368] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.use_linked_clone = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.856537] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.vnc_keymap = en-us {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.856697] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.vnc_port = 5900 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.856857] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vmware.vnc_port_total = 10000 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.857052] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vnc.auth_schemes = ['none'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.857233] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vnc.enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.857515] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.857698] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
vnc.novncproxy_host = 0.0.0.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.857867] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vnc.novncproxy_port = 6080 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.858052] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vnc.server_listen = 127.0.0.1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.858235] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.858388] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vnc.vencrypt_ca_certs = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.858543] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vnc.vencrypt_client_cert = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.858697] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vnc.vencrypt_client_key = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.858873] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.859124] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.disable_deep_image_inspection = False {{(pid=67131) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.859313] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.disable_fallback_pcpu_query = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.859573] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.disable_group_policy_check_upcall = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.859844] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.860137] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.disable_rootwrap = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.860380] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.enable_numa_live_migration = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.860564] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.860729] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.860890] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
workarounds.handle_virt_lifecycle_events = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.861064] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.libvirt_disable_apic = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.861247] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.never_download_image_if_on_rbd = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.861455] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.861622] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.861784] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.861945] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.862122] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.862283] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.862442] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.862601] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.862763] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.862948] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.863133] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.client_socket_timeout = 900 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.863300] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.default_pool_size = 1000 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.863464] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.keep_alive = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.863628] env[67131]: DEBUG 
oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.max_header_line = 16384 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.863790] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.secure_proxy_ssl_header = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.863952] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.ssl_ca_file = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.864128] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.ssl_cert_file = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.864288] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.ssl_key_file = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.864452] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.tcp_keepidle = 600 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.864627] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.864791] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] zvm.ca_file = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.864951] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] zvm.cloud_connector_url = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.865259] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.865433] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] zvm.reachable_timeout = 300 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.865615] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_policy.enforce_new_defaults = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.865783] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_policy.enforce_scope = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.865968] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_policy.policy_default_rule = default {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.866178] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.866358] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_policy.policy_file = policy.yaml {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.866531] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None 
None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.866690] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.866859] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.867017] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.867186] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.867354] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.867527] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.867700] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] profiler.connection_string = messaging:// {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.867863] env[67131]: DEBUG 
oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] profiler.enabled = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.868041] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] profiler.es_doc_type = notification {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.868208] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] profiler.es_scroll_size = 10000 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.868376] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] profiler.es_scroll_time = 2m {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.868536] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] profiler.filter_error_trace = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.868700] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] profiler.hmac_keys = SECRET_KEY {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.868865] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] profiler.sentinel_service_name = mymaster {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.869082] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] profiler.socket_timeout = 0.1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.869260] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
profiler.trace_sqlalchemy = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.869429] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] remote_debug.host = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.869589] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] remote_debug.port = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.869764] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.869944] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.870163] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.870335] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.870497] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.870657] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.870813] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.870971] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.871144] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.871326] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.871509] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.871676] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.871841] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.872013] env[67131]: DEBUG 
oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.872183] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.872357] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.872517] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.872676] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.872835] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.872995] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.873169] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.873332] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.873490] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.873647] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.873809] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.873970] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.ssl = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.874159] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.874327] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.874485] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=67131) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.874652] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.874818] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_rabbit.ssl_version = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.875009] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.875180] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_notifications.retry = -1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.875359] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.875532] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_messaging_notifications.transport_url = **** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.875698] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.auth_section = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.875858] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.auth_type = None {{(pid=67131) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.876023] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.cafile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.876186] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.certfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.876345] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.collect_timing = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.876499] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.connect_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.876653] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.connect_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.876806] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.endpoint_id = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.876960] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.endpoint_override = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.877129] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.insecure = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.877285] env[67131]: DEBUG 
oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.keyfile = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.877438] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.max_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.877591] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.min_version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.877744] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.region_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.877897] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.service_name = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.878063] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.service_type = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.878226] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.split_loggers = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.878384] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.status_code_retries = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.878537] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
oslo_limit.status_code_retry_delay = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.878691] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.timeout = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.878845] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.valid_interfaces = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.879040] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_limit.version = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.879221] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_reports.file_event_handler = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.879385] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_reports.file_event_handler_interval = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.879542] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] oslo_reports.log_dir = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.879708] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.879865] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_linux_bridge_privileged.group = None 
{{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.880076] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.880263] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.880428] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.880585] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_linux_bridge_privileged.user = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.880752] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.880908] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_ovs_privileged.group = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.881080] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_ovs_privileged.helper_command = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.881262] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.881439] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.881598] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] vif_plug_ovs_privileged.user = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.881766] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_linux_bridge.flat_interface = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.881945] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.882134] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.882308] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.882478] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.882640] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.882802] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.882959] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_linux_bridge.vlan_interface = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.883150] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_ovs.isolate_vif = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.883317] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.883480] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.883650] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.883817] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_ovs.ovsdb_interface = native {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.883978] env[67131]: DEBUG oslo_service.service [None 
req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_vif_ovs.per_port_bridge = False {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.884156] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_brick.lock_path = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.884317] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.884474] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] os_brick.wait_mpath_device_interval = 1 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.884638] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] privsep_osbrick.capabilities = [21] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.884793] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] privsep_osbrick.group = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.884948] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] privsep_osbrick.helper_command = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.885127] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.885288] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
privsep_osbrick.thread_pool_size = 8 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.885441] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] privsep_osbrick.user = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.885608] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.885764] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] nova_sys_admin.group = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.885919] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] nova_sys_admin.helper_command = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.886092] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.886253] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] nova_sys_admin.thread_pool_size = 8 {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.886407] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] nova_sys_admin.user = None {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.886588] env[67131]: DEBUG oslo_service.service [None req-45d480f3-cac2-4e11-aa93-153a612ae91c None None] 
******************************************************************************** {{(pid=67131) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 561.887021] env[67131]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 561.895170] env[67131]: INFO nova.virt.node [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Generated node identity d05f24fe-4395-4079-99ef-1ac1245f55e5 [ 561.895402] env[67131]: INFO nova.virt.node [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Wrote node identity d05f24fe-4395-4079-99ef-1ac1245f55e5 to /opt/stack/data/n-cpu-1/compute_id [ 561.906644] env[67131]: WARNING nova.compute.manager [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Compute nodes ['d05f24fe-4395-4079-99ef-1ac1245f55e5'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 561.937601] env[67131]: INFO nova.compute.manager [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 561.959910] env[67131]: WARNING nova.compute.manager [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
[ 561.960172] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 561.960394] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 561.960536] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 561.960687] env[67131]: DEBUG nova.compute.resource_tracker [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67131) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 561.961802] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44567ee2-8493-4e4a-b165-130c95737348 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.971187] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1349fee2-404c-4b08-8575-874d75f2d36d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.984476] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-7d7f460e-2961-4b03-9511-44404ff9ba8c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.990768] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c54cc8a-f965-498c-aee3-df68b4cd1212 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.019647] env[67131]: DEBUG nova.compute.resource_tracker [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180921MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67131) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 562.019833] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.019984] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.032024] env[67131]: WARNING nova.compute.resource_tracker [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] No compute node record for cpu-1:d05f24fe-4395-4079-99ef-1ac1245f55e5: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host d05f24fe-4395-4079-99ef-1ac1245f55e5 could not be found. 
[ 562.043886] env[67131]: INFO nova.compute.resource_tracker [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: d05f24fe-4395-4079-99ef-1ac1245f55e5 [ 562.093153] env[67131]: DEBUG nova.compute.resource_tracker [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 562.093348] env[67131]: DEBUG nova.compute.resource_tracker [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 562.191605] env[67131]: INFO nova.scheduler.client.report [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] [req-54b07a73-ab03-46e7-8b18-00b8392b86ee] Created resource provider record via placement API for resource provider with UUID d05f24fe-4395-4079-99ef-1ac1245f55e5 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
[ 562.207289] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f07f9e3a-dc66-4550-8d2e-083e586307de {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.214721] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9346a3c-bf08-43c2-97d3-0d93145acda8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.243995] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce565938-f3ad-421b-b87d-e9c173b6b2aa {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.251601] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62db3356-64a7-4c23-9414-5d546e1147c1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.264960] env[67131]: DEBUG nova.compute.provider_tree [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Updating inventory in ProviderTree for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 562.300206] env[67131]: DEBUG nova.scheduler.client.report [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Updated inventory for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 
'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 562.300439] env[67131]: DEBUG nova.compute.provider_tree [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Updating resource provider d05f24fe-4395-4079-99ef-1ac1245f55e5 generation from 0 to 1 during operation: update_inventory {{(pid=67131) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 562.300579] env[67131]: DEBUG nova.compute.provider_tree [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Updating inventory in ProviderTree for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 562.343283] env[67131]: DEBUG nova.compute.provider_tree [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Updating resource provider d05f24fe-4395-4079-99ef-1ac1245f55e5 generation from 1 to 2 during operation: update_traits {{(pid=67131) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 562.361831] env[67131]: DEBUG nova.compute.resource_tracker [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67131) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 562.362082] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.342s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.362264] env[67131]: DEBUG nova.service [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Creating RPC server for service compute {{(pid=67131) start /opt/stack/nova/nova/service.py:182}} [ 562.374450] env[67131]: DEBUG nova.service [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] Join ServiceGroup membership for this service compute {{(pid=67131) start /opt/stack/nova/nova/service.py:199}} [ 562.374631] env[67131]: DEBUG nova.servicegroup.drivers.db [None req-0280b1b4-0126-4f74-b25c-fb111bc4491e None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=67131) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 600.379665] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._sync_power_states {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 600.392948] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Getting list of instances from cluster (obj){ [ 600.392948] env[67131]: value = "domain-c8" [ 600.392948] env[67131]: _type = "ClusterComputeResource" [ 600.392948] env[67131]: } {{(pid=67131) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 600.395040] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80a037b8-93d1-40d9-afc6-9137c3705119 {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.410184] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Got total of 0 instances {{(pid=67131) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 600.410184] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 600.410184] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Getting list of instances from cluster (obj){ [ 600.410184] env[67131]: value = "domain-c8" [ 600.410184] env[67131]: _type = "ClusterComputeResource" [ 600.410184] env[67131]: } {{(pid=67131) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 600.413672] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd648d8a-f75a-48b3-8520-b27d8bccab1c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.422573] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Got total of 0 instances {{(pid=67131) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 602.364613] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Acquiring lock "c9a491fe-aff4-4b4f-bcfb-dd56f1010576" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.364613] env[67131]: DEBUG oslo_concurrency.lockutils [None 
req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Lock "c9a491fe-aff4-4b4f-bcfb-dd56f1010576" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.379698] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 602.484019] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.484019] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.485337] env[67131]: INFO nova.compute.claims [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 602.640186] env[67131]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdbf045f-c795-47e5-aed2-40ac2fee9e17 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.652623] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78140324-5d84-4639-8c19-dd468ce7c472 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.688164] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf149dde-d540-4adb-977d-e1c76c1d3108 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.700304] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99edebea-5fd8-4c2e-932c-e34f6d539630 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.719449] env[67131]: DEBUG nova.compute.provider_tree [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 602.728703] env[67131]: DEBUG nova.scheduler.client.report [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 
1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 602.749830] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.750446] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 602.791518] env[67131]: DEBUG nova.compute.utils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 602.797251] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Allocating IP information in the background. 
{{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 602.797251] env[67131]: DEBUG nova.network.neutron [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 602.815012] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 602.912975] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 603.113876] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Acquiring lock "e55f1592-024d-431d-b3a9-63b27513cac4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.115367] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Lock "e55f1592-024d-431d-b3a9-63b27513cac4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.132259] env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 603.194023] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.194023] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.194023] env[67131]: INFO nova.compute.claims [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 603.310331] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cfe18cd-4fba-4e2a-ba63-9b33d9c31c4a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.324304] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de313c84-cead-4e72-b3f5-7709f53b2f25 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.360809] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-226cfd82-3dd2-4f54-bc13-99d905584c6a {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.370676] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46f3f843-5a16-4f82-8544-202bce68d608 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.382781] env[67131]: DEBUG nova.compute.provider_tree [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 603.394678] env[67131]: DEBUG nova.scheduler.client.report [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 603.415083] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.223s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.415660] env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b 
tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 603.464386] env[67131]: DEBUG nova.compute.utils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 603.465751] env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 603.465975] env[67131]: DEBUG nova.network.neutron [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 603.480615] env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 603.558180] env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 604.417371] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquiring lock "d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.417371] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Lock "d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.427773] env[67131]: DEBUG nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 604.484315] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.484315] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.485693] env[67131]: INFO nova.compute.claims [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 604.615811] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-200f23ed-6c29-43ad-a35d-9d1cfd2413fe {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.624268] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffbd550e-e170-4696-8308-81ffafef417c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.655457] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-457c649f-c610-4c06-abb3-5037b49a0754 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
604.662874] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5eff5732-cac7-41b1-89c4-20e09f75ce37 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.678017] env[67131]: DEBUG nova.compute.provider_tree [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 604.686016] env[67131]: DEBUG nova.scheduler.client.report [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 604.700732] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.701267] env[67131]: DEBUG nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] 
[instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 604.740764] env[67131]: DEBUG nova.compute.utils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 604.741012] env[67131]: DEBUG nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Not allocating networking since 'none' was specified. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 604.756247] env[67131]: DEBUG nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 604.830164] env[67131]: DEBUG nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 604.855871] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 604.856380] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 604.856699] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 604.859112] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Flavor pref 
0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 604.859112] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 604.859112] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 604.859112] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 604.859112] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 604.859317] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 604.859317] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 
tempest-ServerExternalEventsTest-539057350-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 604.859317] env[67131]: DEBUG nova.virt.hardware [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 604.861557] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3bbfaac-937b-4466-b17d-0360d9283eeb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.883347] env[67131]: DEBUG nova.policy [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '830dc22fd41c4597b235fbd4d7cab38a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aee7b72037a348e79ad015a4997259f9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 604.886697] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-353c39ed-550f-4fe7-a064-98d4aa4d530f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.894487] env[67131]: DEBUG nova.virt.hardware [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 
tempest-ServersAdmin275Test-1632169565-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 604.895046] env[67131]: DEBUG nova.virt.hardware [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 604.895312] env[67131]: DEBUG nova.virt.hardware [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 604.895601] env[67131]: DEBUG nova.virt.hardware [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 604.895852] env[67131]: DEBUG nova.virt.hardware [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 
tempest-ServersAdmin275Test-1632169565-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 604.896233] env[67131]: DEBUG nova.virt.hardware [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 604.896550] env[67131]: DEBUG nova.virt.hardware [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 604.896826] env[67131]: DEBUG nova.virt.hardware [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 604.897132] env[67131]: DEBUG nova.virt.hardware [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 604.899034] env[67131]: DEBUG nova.virt.hardware [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 604.899034] env[67131]: DEBUG nova.virt.hardware [None 
req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 604.900171] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eba87f54-f994-44f9-949b-963a449b806e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.918237] env[67131]: DEBUG nova.virt.hardware [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 604.918665] env[67131]: DEBUG nova.virt.hardware [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 604.919208] env[67131]: DEBUG nova.virt.hardware 
[None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 604.919511] env[67131]: DEBUG nova.virt.hardware [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 604.919802] env[67131]: DEBUG nova.virt.hardware [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 604.920075] env[67131]: DEBUG nova.virt.hardware [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 604.920413] env[67131]: DEBUG nova.virt.hardware [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 604.920723] env[67131]: DEBUG nova.virt.hardware [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 604.921031] env[67131]: DEBUG 
nova.virt.hardware [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 604.922654] env[67131]: DEBUG nova.virt.hardware [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 604.922654] env[67131]: DEBUG nova.virt.hardware [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 604.926153] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d93ac32a-8ae7-4bc7-b1a1-8ca044a0edaf {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.937885] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-022f7581-3c43-43c0-8f96-fef276f7ac90 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.942345] env[67131]: DEBUG nova.policy [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c769718f96754f21b2cf18f4eb80bc5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70c482ddac1f489e8432dfd3e2b67dfd', 'project_domain_id': 'default', 
'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 604.949447] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d56f2368-3171-4f13-9ef4-ee95d79e82b7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.971149] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6930943b-d4d2-43e7-a2c9-045c242e170b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.975560] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Instance VIF info [] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 604.984991] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 604.985319] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-87724998-3503-4abb-ad36-252a69edbf3c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.998556] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Created folder: OpenStack in parent group-v4. 
[ 604.998833] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Creating folder: Project (adf3ce5e82744ca2ae24347f6948658f). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 604.998971] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a0e31fda-8fc6-4438-9a1f-2a38733f8cf7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.008049] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Created folder: Project (adf3ce5e82744ca2ae24347f6948658f) in parent group-v690228. [ 605.008253] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Creating folder: Instances. Parent ref: group-v690229. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 605.008472] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ce642ea-42a5-4b36-8e8a-133ddf4ed18a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.015951] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Created folder: Instances in parent group-v690229. [ 605.016212] env[67131]: DEBUG oslo.service.loopingcall [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 605.016393] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 605.016580] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0c0d92c3-f883-47c6-9e42-915705a529b1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.033623] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 605.033623] env[67131]: value = "task-3456388" [ 605.033623] env[67131]: _type = "Task" [ 605.033623] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 605.041591] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456388, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 605.422965] env[67131]: DEBUG nova.network.neutron [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Successfully created port: a67259b0-2f81-4f65-84f1-22c07b321c02 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 605.545471] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456388, 'name': CreateVM_Task, 'duration_secs': 0.278365} completed successfully. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 605.545648] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 605.547064] env[67131]: DEBUG oslo_vmware.service [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8eb8f12-e55e-4987-a8c5-55ce493b1a7e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.555869] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.555869] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 605.555869] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 605.555869] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with 
opID=oslo.vmware-7451c077-6666-47ae-b2a7-4dab56d3ac1a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.561649] env[67131]: DEBUG oslo_vmware.api [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Waiting for the task: (returnval){ [ 605.561649] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52fe6f93-faa9-8222-2bad-7793f1ce3b40" [ 605.561649] env[67131]: _type = "Task" [ 605.561649] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 605.570829] env[67131]: DEBUG oslo_vmware.api [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52fe6f93-faa9-8222-2bad-7793f1ce3b40, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 605.663122] env[67131]: DEBUG nova.network.neutron [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Successfully created port: d41dc992-39ae-474a-9412-4ad545dafa0e {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 605.808115] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Acquiring lock "47856710-dd0d-4d4a-9af4-ae3db29510e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.808655] env[67131]: DEBUG oslo_concurrency.lockutils 
[None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Lock "47856710-dd0d-4d4a-9af4-ae3db29510e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.822252] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 605.881208] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.881208] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.881208] env[67131]: INFO nova.compute.claims [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 606.033920] env[67131]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb086b66-6074-4fb8-a393-13c9c56a1613 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.045203] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-266f8b64-6263-4e92-b249-bfa049b44ea0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.083622] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-283b17c3-6110-48b9-9a25-88f77f5ea395 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.096554] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.096814] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 606.097090] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.097349] env[67131]: DEBUG oslo_concurrency.lockutils [None 
req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.097913] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 606.099206] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1d47385-2a7c-4be5-8b44-78b2eda51c30 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.105021] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-348f3cf5-085a-483d-aba2-984a44d7c9fb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.116840] env[67131]: DEBUG nova.compute.provider_tree [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 606.121684] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 606.121884] env[67131]: DEBUG nova.virt.vmwareapi.vmops 
[None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 606.122690] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4918ba1d-e181-4d71-b923-a549d848e863 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.126833] env[67131]: DEBUG nova.scheduler.client.report [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 606.132279] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-416cb310-c143-4283-a95d-66183b7790ce {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.137962] env[67131]: DEBUG oslo_vmware.api [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Waiting for the task: (returnval){ [ 606.137962] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]521c13e6-1e5b-bf82-58fe-6d08576a0b3e" [ 606.137962] env[67131]: _type = "Task" [ 606.137962] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.146184] env[67131]: DEBUG oslo_vmware.api [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]521c13e6-1e5b-bf82-58fe-6d08576a0b3e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 606.147150] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.149740] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 606.195839] env[67131]: DEBUG nova.compute.utils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 606.199253] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Allocating IP information in the background. 
{{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 606.200388] env[67131]: DEBUG nova.network.neutron [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 606.211671] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 606.314807] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 606.343555] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 606.343555] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 606.343555] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 606.343773] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Flavor pref 0:0:0 
{{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 606.343773] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 606.343773] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 606.343773] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 606.343773] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 606.343976] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 606.343976] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] 
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 606.343976] env[67131]: DEBUG nova.virt.hardware [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 606.343976] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b63062c-02fc-43b7-ab7b-2f424268c5c1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.352754] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5312c327-2e1b-4968-8ffd-5eb4ec0d9ccd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.358808] env[67131]: DEBUG nova.policy [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a502fb5bb14a4a8490d24fef3aa0643c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aa414adcb34d490892cd3b1abfc8a1b5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 606.647889] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: 
d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 606.648163] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Creating directory with path [datastore1] vmware_temp/45b2c351-20da-457e-aafe-286242463b62/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 606.648388] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-44895127-122f-42be-abed-fc7ecd55b720 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.669041] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Created directory with path [datastore1] vmware_temp/45b2c351-20da-457e-aafe-286242463b62/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 606.669182] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Fetch image to [datastore1] vmware_temp/45b2c351-20da-457e-aafe-286242463b62/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 606.669355] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] 
vmware_temp/45b2c351-20da-457e-aafe-286242463b62/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 606.670180] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6d3dcdc-f741-4d5b-8e61-42f460fde05a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.677682] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce63bccb-640b-447c-b5c2-7af0cdf9c3a3 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.689979] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fa0a82f-f46d-4c4d-8225-2de261470a12 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.723876] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a230bcf5-7769-410a-b8a6-2dde434b49ed {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.730910] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-117779da-b296-41d2-b60c-fe88cd7fc035 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.758828] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 606.822755] env[67131]: DEBUG 
oslo_vmware.rw_handles [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/45b2c351-20da-457e-aafe-286242463b62/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 606.886377] env[67131]: DEBUG oslo_vmware.rw_handles [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 606.887511] env[67131]: DEBUG oslo_vmware.rw_handles [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/45b2c351-20da-457e-aafe-286242463b62/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 607.015153] env[67131]: DEBUG nova.network.neutron [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Successfully created port: e0fe0f11-70fb-43f4-b998-84f21897f4f2 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 607.130311] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Acquiring lock "f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.130628] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Lock "f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.142065] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 607.200056] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.200317] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.201847] env[67131]: INFO nova.compute.claims [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 607.351054] env[67131]: DEBUG nova.network.neutron [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Successfully updated port: a67259b0-2f81-4f65-84f1-22c07b321c02 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 607.359325] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a5a8f43-7ff0-4b7d-ae61-77cbbfa4c130 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.367251] env[67131]: DEBUG oslo_concurrency.lockutils [None 
req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Acquiring lock "refresh_cache-e55f1592-024d-431d-b3a9-63b27513cac4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.367251] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Acquired lock "refresh_cache-e55f1592-024d-431d-b3a9-63b27513cac4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.367251] env[67131]: DEBUG nova.network.neutron [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 607.376700] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9ed32f7-f539-4ab5-9f6d-74d449e994c1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.414472] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24651baf-d04b-47fe-a07d-24ef80a7f05e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.422698] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3dcd29f-ed82-422d-bada-294663e90adc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.438279] env[67131]: DEBUG nova.compute.provider_tree [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 
tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 607.450115] env[67131]: DEBUG nova.scheduler.client.report [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 607.466409] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.466909] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Start building networks asynchronously for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 607.515727] env[67131]: DEBUG nova.compute.utils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 607.516227] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 607.516534] env[67131]: DEBUG nova.network.neutron [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 607.536783] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 607.608114] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 607.662920] env[67131]: DEBUG nova.network.neutron [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 607.734786] env[67131]: DEBUG nova.policy [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd8d3c61878584783924c7e81d145e896', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32c37f40bfd44d399c200b288f892756', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 607.745909] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 607.746174] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 607.746328] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 607.746504] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 607.746644] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 607.746786] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 
tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 607.746986] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 607.748517] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 607.748717] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 607.749385] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 607.749385] env[67131]: DEBUG nova.virt.hardware [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] 
{{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 607.752100] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d7c1ee8-464e-4c4a-8a53-45009495145c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.763673] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4edcd409-4c38-4af5-b869-28a2d4fc43d9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.851223] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquiring lock "cf2feb3e-6cf3-4db2-83fd-beb5e8387e81" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.851456] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Lock "cf2feb3e-6cf3-4db2-83fd-beb5e8387e81" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.862070] env[67131]: DEBUG nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 607.918966] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.919421] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.920850] env[67131]: INFO nova.compute.claims [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 607.946645] env[67131]: DEBUG nova.network.neutron [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Successfully updated port: d41dc992-39ae-474a-9412-4ad545dafa0e {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 607.959748] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Acquiring lock "refresh_cache-c9a491fe-aff4-4b4f-bcfb-dd56f1010576" {{(pid=67131) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.959938] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Acquired lock "refresh_cache-c9a491fe-aff4-4b4f-bcfb-dd56f1010576" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.960129] env[67131]: DEBUG nova.network.neutron [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 608.102222] env[67131]: DEBUG nova.network.neutron [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 608.124705] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a5682d7-8cb3-4593-a39d-364c81656b23 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.136432] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82e7606c-b80d-4842-bf6f-de956321a3f3 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.171063] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2ca1050-f53f-4490-a986-70c3a0abe48e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.178356] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28fa9749-66bc-46cb-bf0c-137ee4076ff6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.203659] env[67131]: DEBUG nova.compute.provider_tree [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 608.219440] env[67131]: DEBUG nova.scheduler.client.report [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 608.246387] env[67131]: DEBUG nova.network.neutron [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Updating instance_info_cache with network_info: [{"id": "a67259b0-2f81-4f65-84f1-22c07b321c02", "address": "fa:16:3e:97:49:20", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa67259b0-2f", "ovs_interfaceid": "a67259b0-2f81-4f65-84f1-22c07b321c02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 608.251521] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 
tempest-ServerDiagnosticsV248Test-1690652633-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.252054] env[67131]: DEBUG nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 608.264395] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Releasing lock "refresh_cache-e55f1592-024d-431d-b3a9-63b27513cac4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 608.264674] env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Instance network_info: |[{"id": "a67259b0-2f81-4f65-84f1-22c07b321c02", "address": "fa:16:3e:97:49:20", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": 
"l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa67259b0-2f", "ovs_interfaceid": "a67259b0-2f81-4f65-84f1-22c07b321c02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 608.265130] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:97:49:20', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd4cb37d4-2060-48b6-9e60-156a71fc7ee3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a67259b0-2f81-4f65-84f1-22c07b321c02', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 608.274357] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Creating folder: Project (aee7b72037a348e79ad015a4997259f9). Parent ref: group-v690228. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 608.275768] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-66f60b78-104c-4b97-8043-3c5397319c46 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.281605] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.281779] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.293150] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Created folder: Project (aee7b72037a348e79ad015a4997259f9) in parent group-v690228. [ 608.293491] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Creating folder: Instances. Parent ref: group-v690232. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 608.294311] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7221307e-51b1-4132-8473-8ad05fc6cc87 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.296484] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 608.307908] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Created folder: Instances in parent group-v690232. [ 608.307908] env[67131]: DEBUG oslo.service.loopingcall [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 608.308464] env[67131]: DEBUG nova.compute.utils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 608.309758] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 608.310227] env[67131]: DEBUG nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Not allocating networking since 'none' was specified. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 608.310394] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-70378a78-1caf-4f13-a979-14766a89eaeb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.330355] env[67131]: DEBUG nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 608.337873] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 608.337873] env[67131]: value = "task-3456391" [ 608.337873] env[67131]: _type = "Task" [ 608.337873] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 608.350415] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456391, 'name': CreateVM_Task} progress is 5%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 608.357706] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.358040] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.359800] env[67131]: INFO nova.compute.claims [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 608.423536] env[67131]: DEBUG nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 608.455701] env[67131]: DEBUG nova.virt.hardware [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 608.455988] env[67131]: DEBUG nova.virt.hardware [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 608.456352] env[67131]: DEBUG nova.virt.hardware [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 608.456577] env[67131]: DEBUG nova.virt.hardware [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 
tempest-ServerDiagnosticsV248Test-1690652633-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 608.456874] env[67131]: DEBUG nova.virt.hardware [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 608.456874] env[67131]: DEBUG nova.virt.hardware [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 608.457114] env[67131]: DEBUG nova.virt.hardware [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 608.457293] env[67131]: DEBUG nova.virt.hardware [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 608.457474] env[67131]: DEBUG nova.virt.hardware [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 608.457637] env[67131]: DEBUG nova.virt.hardware [None 
req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 608.457823] env[67131]: DEBUG nova.virt.hardware [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 608.458751] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06acf7a0-c9cf-40cc-a971-d5c62f001d09 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.470548] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b71c079-1fb9-481f-9ea5-77ea834f48a8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.487240] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Instance VIF info [] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 608.492970] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Creating folder: Project (ef8ba7aea1654ae99459d123ff889b44). Parent ref: group-v690228. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 608.495880] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fc0357c7-4743-4dbc-970f-60a364fd4bbd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.505954] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Created folder: Project (ef8ba7aea1654ae99459d123ff889b44) in parent group-v690228. [ 608.506138] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Creating folder: Instances. Parent ref: group-v690235. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 608.506396] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bc6ce419-65e0-4a85-ba8b-02607d1daaea {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.520298] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Created folder: Instances in parent group-v690235. [ 608.520590] env[67131]: DEBUG oslo.service.loopingcall [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 608.520877] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 608.521126] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3107e50f-078d-4b2d-87b3-52817bb56b85 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.541903] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Acquiring lock "b47e3b03-7b84-4305-a55c-577401e5acf3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.542259] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Lock "b47e3b03-7b84-4305-a55c-577401e5acf3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.549277] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 608.549277] env[67131]: value = "task-3456394" [ 608.549277] env[67131]: _type = "Task" [ 608.549277] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 608.563332] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456394, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 608.569794] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 608.632332] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5df91ec7-7052-4f35-9664-480f25554030 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.642304] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbfaa5ec-b918-49bc-9697-13c9c33ede23 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.646913] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.682681] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7900952-91c9-45c2-9e08-876ecf2d7835 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.690086] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6957ee7f-9711-4def-ac1b-5e9cc1728c38 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.704261] env[67131]: DEBUG nova.compute.provider_tree [None 
req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 608.715049] env[67131]: DEBUG nova.scheduler.client.report [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 608.736553] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.376s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.740235] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Start building networks asynchronously for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 608.741339] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.095s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.745837] env[67131]: INFO nova.compute.claims [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 608.802182] env[67131]: DEBUG nova.compute.utils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 608.807161] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Allocating IP information in the background. 
{{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 608.807355] env[67131]: DEBUG nova.network.neutron [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 608.821178] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 608.858148] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456391, 'name': CreateVM_Task, 'duration_secs': 0.312371} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 608.858315] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 608.920090] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 608.920090] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) 
lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 608.920485] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 608.920591] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3eedc530-3d48-422c-95d0-4cf7458ef19e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.933978] env[67131]: DEBUG oslo_vmware.api [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Waiting for the task: (returnval){ [ 608.933978] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]525b66c3-e922-edb7-89b7-33ed6eb53d64" [ 608.933978] env[67131]: _type = "Task" [ 608.933978] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 608.942391] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 608.950231] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 608.950473] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 608.950682] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 608.985151] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 608.985248] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 608.985719] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 608.985719] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 608.985719] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 608.985901] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Chose 
sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 608.987393] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 608.987572] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 608.989183] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 608.990341] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 608.990341] env[67131]: DEBUG nova.virt.hardware [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 608.990756] env[67131]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55c1f2f4-1791-44dd-bb19-2529ea502ae7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.001861] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f50e9925-15b3-4181-9897-5caf793f46f9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.066934] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456394, 'name': CreateVM_Task, 'duration_secs': 0.279578} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 609.067130] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 609.067528] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 609.067685] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 609.067987] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquired external semaphore 
"[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 609.068240] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-234ee607-57ca-4ae2-92a7-9123a0a0f065 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.073512] env[67131]: DEBUG oslo_vmware.api [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Waiting for the task: (returnval){ [ 609.073512] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52ea53ce-df97-7cc9-50ac-fdc30112524d" [ 609.073512] env[67131]: _type = "Task" [ 609.073512] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 609.090124] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 609.090255] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 609.090691] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquiring lock 
"[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 609.112682] env[67131]: DEBUG nova.network.neutron [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Updating instance_info_cache with network_info: [{"id": "d41dc992-39ae-474a-9412-4ad545dafa0e", "address": "fa:16:3e:7c:1b:32", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd41dc992-39", "ovs_interfaceid": "d41dc992-39ae-474a-9412-4ad545dafa0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 609.115285] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5195fc88-19bd-4c9a-9fe2-a48d90eef7fa {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.122821] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-996b7d4a-dc8a-4695-82f0-623349dec844 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.164427] env[67131]: DEBUG nova.policy [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0e4b1d02497b41958ef3cad6559d17ec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69f3b2f337df4724b66c563c996ed8bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 609.166602] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dd7786d-c6d2-4d2b-bf2d-7fa0a5b850f3 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.169348] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Releasing lock "refresh_cache-c9a491fe-aff4-4b4f-bcfb-dd56f1010576" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 609.169704] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Instance network_info: |[{"id": 
"d41dc992-39ae-474a-9412-4ad545dafa0e", "address": "fa:16:3e:7c:1b:32", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd41dc992-39", "ovs_interfaceid": "d41dc992-39ae-474a-9412-4ad545dafa0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 609.170508] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7c:1b:32', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd4cb37d4-2060-48b6-9e60-156a71fc7ee3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd41dc992-39ae-474a-9412-4ad545dafa0e', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 609.178931] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 
tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Creating folder: Project (70c482ddac1f489e8432dfd3e2b67dfd). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.179335] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f4341da5-d86d-4033-93d7-274aa7751a23 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.185097] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bf53742-042b-436c-b168-008b3b9a6f85 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.194083] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Created folder: Project (70c482ddac1f489e8432dfd3e2b67dfd) in parent group-v690228. [ 609.194361] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Creating folder: Instances. Parent ref: group-v690238. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 609.194976] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a8b0947d-3eed-463c-8f72-c697e24a754c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.206526] env[67131]: DEBUG nova.compute.provider_tree [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 609.217024] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Created folder: Instances in parent group-v690238. [ 609.217024] env[67131]: DEBUG oslo.service.loopingcall [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 609.217024] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 609.217024] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5a3fa362-b3bb-47e8-8103-775c17337da2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.237168] env[67131]: DEBUG nova.scheduler.client.report [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 609.244961] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 609.244961] env[67131]: value = "task-3456397" [ 609.244961] env[67131]: _type = "Task" [ 609.244961] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 609.253972] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456397, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 609.257741] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.516s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.259368] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 609.291956] env[67131]: DEBUG nova.network.neutron [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Successfully created port: 383b824e-b71e-46ba-bbfa-c35640e95719 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 609.324544] env[67131]: DEBUG nova.compute.utils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 609.326589] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Allocating IP information in the background. 
{{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 609.326795] env[67131]: DEBUG nova.network.neutron [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 609.330201] env[67131]: DEBUG nova.compute.manager [req-f03b34b2-5bd2-4837-a6b2-46e75ff5345c req-dd9a0034-6aab-454b-aaf5-ef27b3e87e7d service nova] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Received event network-vif-plugged-a67259b0-2f81-4f65-84f1-22c07b321c02 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 609.330395] env[67131]: DEBUG oslo_concurrency.lockutils [req-f03b34b2-5bd2-4837-a6b2-46e75ff5345c req-dd9a0034-6aab-454b-aaf5-ef27b3e87e7d service nova] Acquiring lock "e55f1592-024d-431d-b3a9-63b27513cac4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.330591] env[67131]: DEBUG oslo_concurrency.lockutils [req-f03b34b2-5bd2-4837-a6b2-46e75ff5345c req-dd9a0034-6aab-454b-aaf5-ef27b3e87e7d service nova] Lock "e55f1592-024d-431d-b3a9-63b27513cac4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.330748] env[67131]: DEBUG oslo_concurrency.lockutils [req-f03b34b2-5bd2-4837-a6b2-46e75ff5345c req-dd9a0034-6aab-454b-aaf5-ef27b3e87e7d service nova] Lock "e55f1592-024d-431d-b3a9-63b27513cac4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
609.330960] env[67131]: DEBUG nova.compute.manager [req-f03b34b2-5bd2-4837-a6b2-46e75ff5345c req-dd9a0034-6aab-454b-aaf5-ef27b3e87e7d service nova] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] No waiting events found dispatching network-vif-plugged-a67259b0-2f81-4f65-84f1-22c07b321c02 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 609.331102] env[67131]: WARNING nova.compute.manager [req-f03b34b2-5bd2-4837-a6b2-46e75ff5345c req-dd9a0034-6aab-454b-aaf5-ef27b3e87e7d service nova] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Received unexpected event network-vif-plugged-a67259b0-2f81-4f65-84f1-22c07b321c02 for instance with vm_state building and task_state spawning. [ 609.349909] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Start building block device mappings for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 609.374046] env[67131]: DEBUG nova.network.neutron [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Successfully updated port: e0fe0f11-70fb-43f4-b998-84f21897f4f2 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 609.386041] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Acquiring lock "refresh_cache-47856710-dd0d-4d4a-9af4-ae3db29510e9" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 609.386041] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Acquired lock "refresh_cache-47856710-dd0d-4d4a-9af4-ae3db29510e9" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 609.386041] env[67131]: DEBUG nova.network.neutron [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 609.450171] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 609.483703] env[67131]: DEBUG nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 609.483951] env[67131]: DEBUG nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 609.484121] env[67131]: DEBUG nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 609.484307] env[67131]: DEBUG nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 
tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 609.484452] env[67131]: DEBUG nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 609.484594] env[67131]: DEBUG nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 609.484872] env[67131]: DEBUG nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 609.484961] env[67131]: DEBUG nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 609.486243] env[67131]: DEBUG nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 609.486442] env[67131]: DEBUG 
nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 609.487288] env[67131]: DEBUG nova.virt.hardware [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 609.487807] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1c48eb9-cae4-4048-965d-b63b7c49a388 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.500170] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1c9f12d-0579-46e7-bee2-adc803894d80 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.520453] env[67131]: DEBUG nova.network.neutron [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 609.635863] env[67131]: DEBUG nova.policy [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '74f866f97d9a43ea955f63d5d63448a6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a242165d37148439e59502ff3c2f6c2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 609.763150] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456397, 'name': CreateVM_Task, 'duration_secs': 0.281782} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 609.763150] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 609.763150] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 609.763150] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" 
{{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 609.763462] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 609.763462] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-07f9fa6e-b019-4c92-9789-8d57fdca33f8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.770106] env[67131]: DEBUG oslo_vmware.api [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Waiting for the task: (returnval){ [ 609.770106] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]529e8535-e463-f2d2-924f-502c98b16e94" [ 609.770106] env[67131]: _type = "Task" [ 609.770106] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 609.782864] env[67131]: DEBUG oslo_vmware.api [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]529e8535-e463-f2d2-924f-502c98b16e94, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 609.889291] env[67131]: DEBUG nova.compute.manager [req-7b3aa371-c334-4a2d-8b53-617f4da7039e req-22131863-f841-477c-a228-ddad18de0722 service nova] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Received event network-vif-plugged-d41dc992-39ae-474a-9412-4ad545dafa0e {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 609.889458] env[67131]: DEBUG oslo_concurrency.lockutils [req-7b3aa371-c334-4a2d-8b53-617f4da7039e req-22131863-f841-477c-a228-ddad18de0722 service nova] Acquiring lock "c9a491fe-aff4-4b4f-bcfb-dd56f1010576-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 609.889733] env[67131]: DEBUG oslo_concurrency.lockutils [req-7b3aa371-c334-4a2d-8b53-617f4da7039e req-22131863-f841-477c-a228-ddad18de0722 service nova] Lock "c9a491fe-aff4-4b4f-bcfb-dd56f1010576-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 609.889843] env[67131]: DEBUG oslo_concurrency.lockutils [req-7b3aa371-c334-4a2d-8b53-617f4da7039e req-22131863-f841-477c-a228-ddad18de0722 service nova] Lock "c9a491fe-aff4-4b4f-bcfb-dd56f1010576-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.890082] env[67131]: DEBUG nova.compute.manager [req-7b3aa371-c334-4a2d-8b53-617f4da7039e req-22131863-f841-477c-a228-ddad18de0722 service nova] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] No waiting events found dispatching network-vif-plugged-d41dc992-39ae-474a-9412-4ad545dafa0e {{(pid=67131) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 609.891498] env[67131]: WARNING nova.compute.manager [req-7b3aa371-c334-4a2d-8b53-617f4da7039e req-22131863-f841-477c-a228-ddad18de0722 service nova] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Received unexpected event network-vif-plugged-d41dc992-39ae-474a-9412-4ad545dafa0e for instance with vm_state building and task_state spawning. [ 610.291304] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 610.292176] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 610.292176] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 610.378969] env[67131]: DEBUG nova.network.neutron [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Updating instance_info_cache with network_info: [{"id": "e0fe0f11-70fb-43f4-b998-84f21897f4f2", "address": "fa:16:3e:a1:da:f0", 
"network": {"id": "7ec24047-b138-457e-8974-206f5fe83d7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1591500436-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "aa414adcb34d490892cd3b1abfc8a1b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8459aaf-d6a8-46fb-ad14-464ac3104695", "external-id": "nsx-vlan-transportzone-46", "segmentation_id": 46, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape0fe0f11-70", "ovs_interfaceid": "e0fe0f11-70fb-43f4-b998-84f21897f4f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 610.392860] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Releasing lock "refresh_cache-47856710-dd0d-4d4a-9af4-ae3db29510e9" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 610.393325] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Instance network_info: |[{"id": "e0fe0f11-70fb-43f4-b998-84f21897f4f2", "address": "fa:16:3e:a1:da:f0", "network": {"id": "7ec24047-b138-457e-8974-206f5fe83d7f", "bridge": "br-int", 
"label": "tempest-ImagesNegativeTestJSON-1591500436-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "aa414adcb34d490892cd3b1abfc8a1b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8459aaf-d6a8-46fb-ad14-464ac3104695", "external-id": "nsx-vlan-transportzone-46", "segmentation_id": 46, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape0fe0f11-70", "ovs_interfaceid": "e0fe0f11-70fb-43f4-b998-84f21897f4f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 610.393615] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a1:da:f0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c8459aaf-d6a8-46fb-ad14-464ac3104695', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e0fe0f11-70fb-43f4-b998-84f21897f4f2', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 610.401670] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Creating folder: Project 
(aa414adcb34d490892cd3b1abfc8a1b5). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 610.401954] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5e755034-376e-474c-90a7-7f1fa0f6db3c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.415337] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Created folder: Project (aa414adcb34d490892cd3b1abfc8a1b5) in parent group-v690228. [ 610.415337] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Creating folder: Instances. Parent ref: group-v690241. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 610.415337] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-19acd696-9aaf-475b-b066-f7822431df50 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.424414] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Created folder: Instances in parent group-v690241. [ 610.424414] env[67131]: DEBUG oslo.service.loopingcall [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 610.424414] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 610.426130] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3576f038-e8a8-4c3f-ad65-bb9c95d86881 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.450970] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 610.450970] env[67131]: value = "task-3456400" [ 610.450970] env[67131]: _type = "Task" [ 610.450970] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 610.466721] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456400, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 610.578590] env[67131]: DEBUG nova.network.neutron [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Successfully created port: 1a66cd40-0c6b-463a-942a-b066de9752e5 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 610.883850] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquiring lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.884123] env[67131]: DEBUG oslo_concurrency.lockutils [None 
req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.897747] env[67131]: DEBUG nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 610.905197] env[67131]: DEBUG nova.network.neutron [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Successfully created port: fe2e0110-8252-4bb0-bf22-3534de8f8fe2 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 610.962650] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456400, 'name': CreateVM_Task, 'duration_secs': 0.313833} completed successfully. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 610.962650] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 610.963098] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 610.963282] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 610.963576] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 610.963801] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f3279ce7-775b-47ab-801c-80449ef70130 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.967447] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.967447] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 610.971124] env[67131]: INFO nova.compute.claims [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 610.978106] env[67131]: DEBUG oslo_vmware.api [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Waiting for the task: (returnval){ [ 610.978106] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52d74ce2-0c68-a6a2-0d19-d41afb940d0b" [ 610.978106] env[67131]: _type = "Task" [ 610.978106] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 610.995314] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 610.995583] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 610.995825] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 611.229052] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-285967ec-5f2e-4641-9583-37cd55e17e1e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.237023] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f3c7cc7-15fe-4cf7-9512-ea6e21f9e6fc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.270984] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a5852cbe-9eca-486f-a8bd-1a870473d151 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.278359] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-999457f5-3542-4cd0-88a7-80f1df774feb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.291919] env[67131]: DEBUG nova.compute.provider_tree [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 611.306777] env[67131]: DEBUG nova.scheduler.client.report [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 611.322764] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.323931] env[67131]: DEBUG 
nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 611.365682] env[67131]: DEBUG nova.compute.utils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 611.366795] env[67131]: DEBUG nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 611.366967] env[67131]: DEBUG nova.network.neutron [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 611.382423] env[67131]: DEBUG nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Start building block device mappings for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 611.397042] env[67131]: DEBUG nova.network.neutron [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Successfully updated port: 383b824e-b71e-46ba-bbfa-c35640e95719 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 611.412363] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Acquiring lock "refresh_cache-f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 611.412452] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Acquired lock "refresh_cache-f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 611.412535] env[67131]: DEBUG nova.network.neutron [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 611.492413] env[67131]: DEBUG nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 611.531119] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 611.531314] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 611.531462] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 611.531634] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Flavor pref 0:0:0 {{(pid=67131) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 611.531768] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 611.531905] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 611.532149] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 611.532409] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 611.532469] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 611.532631] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 611.532810] env[67131]: DEBUG nova.virt.hardware [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 611.534169] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c3544c5-9929-4686-9792-91eecc4911c0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.545352] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee39016c-32f8-4b39-9ae9-af57d6fb7c7d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.683501] env[67131]: DEBUG nova.network.neutron [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 611.849456] env[67131]: DEBUG nova.policy [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ecddbfb492e84def94bc97a2840f9936', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '63d6ed8d60094feab63587e063979d5c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 612.327304] env[67131]: DEBUG nova.network.neutron [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Successfully updated port: 1a66cd40-0c6b-463a-942a-b066de9752e5 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 612.342619] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "refresh_cache-fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 612.342733] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquired lock "refresh_cache-fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 612.342883] env[67131]: 
DEBUG nova.network.neutron [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 612.360124] env[67131]: DEBUG nova.network.neutron [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Updating instance_info_cache with network_info: [{"id": "383b824e-b71e-46ba-bbfa-c35640e95719", "address": "fa:16:3e:ee:b3:77", "network": {"id": "62196244-084d-44d2-8b00-7d0f4146752d", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1462018196-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32c37f40bfd44d399c200b288f892756", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bec1528b-3e87-477b-8ab2-02696ad47e66", "external-id": "nsx-vlan-transportzone-180", "segmentation_id": 180, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap383b824e-b7", "ovs_interfaceid": "383b824e-b71e-46ba-bbfa-c35640e95719", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 612.385431] env[67131]: DEBUG 
oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Releasing lock "refresh_cache-f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 612.385739] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Instance network_info: |[{"id": "383b824e-b71e-46ba-bbfa-c35640e95719", "address": "fa:16:3e:ee:b3:77", "network": {"id": "62196244-084d-44d2-8b00-7d0f4146752d", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1462018196-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32c37f40bfd44d399c200b288f892756", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bec1528b-3e87-477b-8ab2-02696ad47e66", "external-id": "nsx-vlan-transportzone-180", "segmentation_id": 180, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap383b824e-b7", "ovs_interfaceid": "383b824e-b71e-46ba-bbfa-c35640e95719", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 612.386288] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None 
req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ee:b3:77', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bec1528b-3e87-477b-8ab2-02696ad47e66', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '383b824e-b71e-46ba-bbfa-c35640e95719', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 612.404319] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Creating folder: Project (32c37f40bfd44d399c200b288f892756). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 612.404319] env[67131]: DEBUG nova.network.neutron [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 612.406186] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d29be028-7769-4f0b-9160-756eda84ce29 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.420582] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Created folder: Project (32c37f40bfd44d399c200b288f892756) in parent group-v690228. 
[ 612.420799] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Creating folder: Instances. Parent ref: group-v690244. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 612.421043] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f1545d30-f6b8-4847-8713-6952b18b141b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.430202] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Created folder: Instances in parent group-v690244. [ 612.430437] env[67131]: DEBUG oslo.service.loopingcall [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 612.430607] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 612.430834] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-914736b0-692a-4075-b8d3-646ccef8bb35 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.454731] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 612.454731] env[67131]: value = "task-3456403" [ 612.454731] env[67131]: _type = "Task" [ 612.454731] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 612.462304] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456403, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 612.679637] env[67131]: DEBUG nova.network.neutron [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Successfully created port: e979a923-3f4d-4035-9075-90dcf514b341 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 612.698674] env[67131]: DEBUG nova.compute.manager [req-4cecc3f8-17ab-447a-ab55-9273965d5901 req-b8d69076-8e89-4755-9545-ad30e4762fed service nova] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Received event network-changed-a67259b0-2f81-4f65-84f1-22c07b321c02 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 612.698962] env[67131]: DEBUG nova.compute.manager [req-4cecc3f8-17ab-447a-ab55-9273965d5901 req-b8d69076-8e89-4755-9545-ad30e4762fed service nova] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Refreshing instance network info cache due to event network-changed-a67259b0-2f81-4f65-84f1-22c07b321c02. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 612.699232] env[67131]: DEBUG oslo_concurrency.lockutils [req-4cecc3f8-17ab-447a-ab55-9273965d5901 req-b8d69076-8e89-4755-9545-ad30e4762fed service nova] Acquiring lock "refresh_cache-e55f1592-024d-431d-b3a9-63b27513cac4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 612.699385] env[67131]: DEBUG oslo_concurrency.lockutils [req-4cecc3f8-17ab-447a-ab55-9273965d5901 req-b8d69076-8e89-4755-9545-ad30e4762fed service nova] Acquired lock "refresh_cache-e55f1592-024d-431d-b3a9-63b27513cac4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 612.699546] env[67131]: DEBUG nova.network.neutron [req-4cecc3f8-17ab-447a-ab55-9273965d5901 req-b8d69076-8e89-4755-9545-ad30e4762fed service nova] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Refreshing network info cache for port a67259b0-2f81-4f65-84f1-22c07b321c02 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 612.709961] env[67131]: DEBUG nova.network.neutron [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Updating instance_info_cache with network_info: [{"id": "1a66cd40-0c6b-463a-942a-b066de9752e5", "address": "fa:16:3e:66:e1:f3", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, 
"physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1a66cd40-0c", "ovs_interfaceid": "1a66cd40-0c6b-463a-942a-b066de9752e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 612.724294] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Releasing lock "refresh_cache-fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 612.724832] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Instance network_info: |[{"id": "1a66cd40-0c6b-463a-942a-b066de9752e5", "address": "fa:16:3e:66:e1:f3", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", 
"port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1a66cd40-0c", "ovs_interfaceid": "1a66cd40-0c6b-463a-942a-b066de9752e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 612.724964] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:66:e1:f3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd4cb37d4-2060-48b6-9e60-156a71fc7ee3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1a66cd40-0c6b-463a-942a-b066de9752e5', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 612.733266] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Creating folder: Project (69f3b2f337df4724b66c563c996ed8bb). Parent ref: group-v690228. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 612.733880] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-31668899-625d-4252-bbe1-7f1ad403bdc8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.744762] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Created folder: Project (69f3b2f337df4724b66c563c996ed8bb) in parent group-v690228. [ 612.747016] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Creating folder: Instances. Parent ref: group-v690247. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 612.747016] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a5287c9c-5032-47e5-91f8-6df7243662b7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.755083] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Created folder: Instances in parent group-v690247. [ 612.755816] env[67131]: DEBUG oslo.service.loopingcall [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 612.755816] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 612.755816] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-258fee1c-502e-4b81-a970-d99f0e065f26 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.782476] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 612.782476] env[67131]: value = "task-3456406" [ 612.782476] env[67131]: _type = "Task" [ 612.782476] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 612.791355] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456406, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 612.925220] env[67131]: DEBUG nova.network.neutron [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Successfully updated port: fe2e0110-8252-4bb0-bf22-3534de8f8fe2 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 612.942069] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Acquiring lock "refresh_cache-b47e3b03-7b84-4305-a55c-577401e5acf3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 612.942321] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b 
tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Acquired lock "refresh_cache-b47e3b03-7b84-4305-a55c-577401e5acf3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 612.945338] env[67131]: DEBUG nova.network.neutron [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 612.975658] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456403, 'name': CreateVM_Task, 'duration_secs': 0.291191} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 612.975852] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 612.976561] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 612.976711] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 612.977045] env[67131]: DEBUG oslo_concurrency.lockutils 
[None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 612.977301] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-338efd60-8628-4da0-9826-db363cc3d354 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.982697] env[67131]: DEBUG oslo_vmware.api [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Waiting for the task: (returnval){ [ 612.982697] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]522a3740-f548-dd7b-e9a9-6a2e07b9fe10" [ 612.982697] env[67131]: _type = "Task" [ 612.982697] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 612.994487] env[67131]: DEBUG oslo_vmware.api [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]522a3740-f548-dd7b-e9a9-6a2e07b9fe10, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 613.078752] env[67131]: DEBUG nova.network.neutron [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 613.293927] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456406, 'name': CreateVM_Task, 'duration_secs': 0.315559} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 613.293927] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 613.294791] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 613.495837] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 613.495837] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 613.495837] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] 
Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 613.495837] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 613.496426] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 613.496757] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ceeab65a-bec1-4bb2-ac05-52701dc2e74d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.507460] env[67131]: DEBUG oslo_vmware.api [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Waiting for the task: (returnval){ [ 613.507460] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]520b3398-21de-d056-5b35-0bb738d2ce3d" [ 613.507460] env[67131]: _type = "Task" [ 613.507460] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 613.518106] env[67131]: DEBUG nova.network.neutron [req-4cecc3f8-17ab-447a-ab55-9273965d5901 req-b8d69076-8e89-4755-9545-ad30e4762fed service nova] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Updated VIF entry in instance network info cache for port a67259b0-2f81-4f65-84f1-22c07b321c02. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 613.518431] env[67131]: DEBUG nova.network.neutron [req-4cecc3f8-17ab-447a-ab55-9273965d5901 req-b8d69076-8e89-4755-9545-ad30e4762fed service nova] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Updating instance_info_cache with network_info: [{"id": "a67259b0-2f81-4f65-84f1-22c07b321c02", "address": "fa:16:3e:97:49:20", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.165", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa67259b0-2f", "ovs_interfaceid": "a67259b0-2f81-4f65-84f1-22c07b321c02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 613.519699] env[67131]: DEBUG 
oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 613.520419] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 613.520419] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 613.532128] env[67131]: DEBUG oslo_concurrency.lockutils [req-4cecc3f8-17ab-447a-ab55-9273965d5901 req-b8d69076-8e89-4755-9545-ad30e4762fed service nova] Releasing lock "refresh_cache-e55f1592-024d-431d-b3a9-63b27513cac4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 613.665431] env[67131]: DEBUG nova.network.neutron [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Updating instance_info_cache with network_info: [{"id": "fe2e0110-8252-4bb0-bf22-3534de8f8fe2", "address": "fa:16:3e:9c:56:b5", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": 
"br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfe2e0110-82", "ovs_interfaceid": "fe2e0110-8252-4bb0-bf22-3534de8f8fe2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 613.679020] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Releasing lock "refresh_cache-b47e3b03-7b84-4305-a55c-577401e5acf3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 613.679020] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Instance network_info: |[{"id": "fe2e0110-8252-4bb0-bf22-3534de8f8fe2", "address": "fa:16:3e:9c:56:b5", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": 
{"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfe2e0110-82", "ovs_interfaceid": "fe2e0110-8252-4bb0-bf22-3534de8f8fe2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 613.679387] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9c:56:b5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd4cb37d4-2060-48b6-9e60-156a71fc7ee3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fe2e0110-8252-4bb0-bf22-3534de8f8fe2', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 613.685454] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Creating folder: Project (4a242165d37148439e59502ff3c2f6c2). Parent ref: group-v690228. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 613.686154] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-90867071-e2ce-48e2-87cf-f636274f6af9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.700181] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Created folder: Project (4a242165d37148439e59502ff3c2f6c2) in parent group-v690228. [ 613.700181] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Creating folder: Instances. Parent ref: group-v690250. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 613.700181] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0edb7611-4b22-43bb-9786-d76d18405900 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.708510] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Created folder: Instances in parent group-v690250. [ 613.709500] env[67131]: DEBUG oslo.service.loopingcall [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 613.709500] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 613.709500] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-88c56cc0-a0ca-4037-a68b-38fe3c6d4831 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.741646] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 613.741646] env[67131]: value = "task-3456409" [ 613.741646] env[67131]: _type = "Task" [ 613.741646] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 613.756556] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456409, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 613.803029] env[67131]: DEBUG nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Received event network-changed-d41dc992-39ae-474a-9412-4ad545dafa0e {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 613.803029] env[67131]: DEBUG nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Refreshing instance network info cache due to event network-changed-d41dc992-39ae-474a-9412-4ad545dafa0e. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 613.803029] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Acquiring lock "refresh_cache-c9a491fe-aff4-4b4f-bcfb-dd56f1010576" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 613.803029] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Acquired lock "refresh_cache-c9a491fe-aff4-4b4f-bcfb-dd56f1010576" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 613.803029] env[67131]: DEBUG nova.network.neutron [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Refreshing network info cache for port d41dc992-39ae-474a-9412-4ad545dafa0e {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 614.239372] env[67131]: DEBUG nova.compute.manager [req-2d783917-3760-40b5-878e-cf2f2eecbe7e req-7281410d-9959-48df-8522-e0d489815d5f service nova] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Received event network-vif-plugged-fe2e0110-8252-4bb0-bf22-3534de8f8fe2 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 614.239372] env[67131]: DEBUG oslo_concurrency.lockutils [req-2d783917-3760-40b5-878e-cf2f2eecbe7e req-7281410d-9959-48df-8522-e0d489815d5f service nova] Acquiring lock "b47e3b03-7b84-4305-a55c-577401e5acf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 614.239372] env[67131]: DEBUG oslo_concurrency.lockutils [req-2d783917-3760-40b5-878e-cf2f2eecbe7e req-7281410d-9959-48df-8522-e0d489815d5f service nova] Lock 
"b47e3b03-7b84-4305-a55c-577401e5acf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 614.239372] env[67131]: DEBUG oslo_concurrency.lockutils [req-2d783917-3760-40b5-878e-cf2f2eecbe7e req-7281410d-9959-48df-8522-e0d489815d5f service nova] Lock "b47e3b03-7b84-4305-a55c-577401e5acf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 614.239511] env[67131]: DEBUG nova.compute.manager [req-2d783917-3760-40b5-878e-cf2f2eecbe7e req-7281410d-9959-48df-8522-e0d489815d5f service nova] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] No waiting events found dispatching network-vif-plugged-fe2e0110-8252-4bb0-bf22-3534de8f8fe2 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 614.239511] env[67131]: WARNING nova.compute.manager [req-2d783917-3760-40b5-878e-cf2f2eecbe7e req-7281410d-9959-48df-8522-e0d489815d5f service nova] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Received unexpected event network-vif-plugged-fe2e0110-8252-4bb0-bf22-3534de8f8fe2 for instance with vm_state building and task_state spawning. [ 614.256932] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456409, 'name': CreateVM_Task, 'duration_secs': 0.30008} completed successfully. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 614.256932] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 614.256932] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 614.256932] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 614.256932] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 614.257209] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4109e4bf-0e0f-4793-9fd0-0ef0145e99d3 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.262018] env[67131]: DEBUG oslo_vmware.api [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 
tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Waiting for the task: (returnval){ [ 614.262018] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52de4c3c-80e1-d374-a0d6-5878d0e5ade7" [ 614.262018] env[67131]: _type = "Task" [ 614.262018] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 614.268620] env[67131]: DEBUG oslo_vmware.api [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52de4c3c-80e1-d374-a0d6-5878d0e5ade7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 614.634718] env[67131]: DEBUG nova.network.neutron [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Successfully updated port: e979a923-3f4d-4035-9075-90dcf514b341 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 614.648296] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquiring lock "refresh_cache-39b67ef8-fce0-4bf3-b161-b5fbd588214b" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 614.648869] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquired lock "refresh_cache-39b67ef8-fce0-4bf3-b161-b5fbd588214b" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 614.652013] env[67131]: DEBUG nova.network.neutron [None 
req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 614.693960] env[67131]: DEBUG nova.network.neutron [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Updated VIF entry in instance network info cache for port d41dc992-39ae-474a-9412-4ad545dafa0e. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 614.694578] env[67131]: DEBUG nova.network.neutron [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Updating instance_info_cache with network_info: [{"id": "d41dc992-39ae-474a-9412-4ad545dafa0e", "address": "fa:16:3e:7c:1b:32", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd41dc992-39", "ovs_interfaceid": "d41dc992-39ae-474a-9412-4ad545dafa0e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 614.705950] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Releasing lock "refresh_cache-c9a491fe-aff4-4b4f-bcfb-dd56f1010576" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 614.706743] env[67131]: DEBUG nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Received event network-vif-plugged-e0fe0f11-70fb-43f4-b998-84f21897f4f2 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 614.706743] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Acquiring lock "47856710-dd0d-4d4a-9af4-ae3db29510e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 614.706743] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Lock "47856710-dd0d-4d4a-9af4-ae3db29510e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 614.706743] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Lock "47856710-dd0d-4d4a-9af4-ae3db29510e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 614.707065] env[67131]: DEBUG nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] No waiting events found dispatching network-vif-plugged-e0fe0f11-70fb-43f4-b998-84f21897f4f2 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 614.707065] env[67131]: WARNING nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Received unexpected event network-vif-plugged-e0fe0f11-70fb-43f4-b998-84f21897f4f2 for instance with vm_state building and task_state spawning. [ 614.707281] env[67131]: DEBUG nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Received event network-changed-e0fe0f11-70fb-43f4-b998-84f21897f4f2 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 614.707367] env[67131]: DEBUG nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Refreshing instance network info cache due to event network-changed-e0fe0f11-70fb-43f4-b998-84f21897f4f2. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 614.707550] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Acquiring lock "refresh_cache-47856710-dd0d-4d4a-9af4-ae3db29510e9" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 614.707702] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Acquired lock "refresh_cache-47856710-dd0d-4d4a-9af4-ae3db29510e9" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 614.707881] env[67131]: DEBUG nova.network.neutron [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Refreshing network info cache for port e0fe0f11-70fb-43f4-b998-84f21897f4f2 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 614.709974] env[67131]: DEBUG nova.network.neutron [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 614.775952] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 614.776618] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 614.776618] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 615.263250] env[67131]: DEBUG nova.network.neutron [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Updating instance_info_cache with network_info: [{"id": "e979a923-3f4d-4035-9075-90dcf514b341", "address": "fa:16:3e:10:04:6a", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.182", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape979a923-3f", "ovs_interfaceid": "e979a923-3f4d-4035-9075-90dcf514b341", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 615.278229] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Releasing lock "refresh_cache-39b67ef8-fce0-4bf3-b161-b5fbd588214b" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 615.278533] env[67131]: DEBUG nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Instance network_info: |[{"id": "e979a923-3f4d-4035-9075-90dcf514b341", "address": "fa:16:3e:10:04:6a", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.182", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, 
"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape979a923-3f", "ovs_interfaceid": "e979a923-3f4d-4035-9075-90dcf514b341", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 615.278967] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:10:04:6a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd4cb37d4-2060-48b6-9e60-156a71fc7ee3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e979a923-3f4d-4035-9075-90dcf514b341', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 615.287421] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Creating folder: Project (63d6ed8d60094feab63587e063979d5c). Parent ref: group-v690228. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 615.288108] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-588455cf-35d0-4fba-bcbd-7cd773757b4d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.301689] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Created folder: Project (63d6ed8d60094feab63587e063979d5c) in parent group-v690228. [ 615.301900] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Creating folder: Instances. Parent ref: group-v690253. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 615.302156] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-89dc837a-d058-4e66-bd33-9234a93bcd69 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.311187] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Created folder: Instances in parent group-v690253. [ 615.311437] env[67131]: DEBUG oslo.service.loopingcall [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 615.311599] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 615.311815] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fce5a34f-213d-4af0-ba15-1d88d01a61f5 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.338455] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 615.338455] env[67131]: value = "task-3456412" [ 615.338455] env[67131]: _type = "Task" [ 615.338455] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 615.346473] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456412, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 615.712798] env[67131]: DEBUG nova.network.neutron [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Updated VIF entry in instance network info cache for port e0fe0f11-70fb-43f4-b998-84f21897f4f2. 
{{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 615.713346] env[67131]: DEBUG nova.network.neutron [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Updating instance_info_cache with network_info: [{"id": "e0fe0f11-70fb-43f4-b998-84f21897f4f2", "address": "fa:16:3e:a1:da:f0", "network": {"id": "7ec24047-b138-457e-8974-206f5fe83d7f", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1591500436-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "aa414adcb34d490892cd3b1abfc8a1b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8459aaf-d6a8-46fb-ad14-464ac3104695", "external-id": "nsx-vlan-transportzone-46", "segmentation_id": 46, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape0fe0f11-70", "ovs_interfaceid": "e0fe0f11-70fb-43f4-b998-84f21897f4f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 615.725197] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Releasing lock "refresh_cache-47856710-dd0d-4d4a-9af4-ae3db29510e9" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 615.725197] env[67131]: DEBUG nova.compute.manager 
[req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Received event network-vif-plugged-383b824e-b71e-46ba-bbfa-c35640e95719 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 615.725197] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Acquiring lock "f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.725363] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Lock "f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.725474] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Lock "f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.725639] env[67131]: DEBUG nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] No waiting events found dispatching network-vif-plugged-383b824e-b71e-46ba-bbfa-c35640e95719 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 615.725804] env[67131]: WARNING nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e 
service nova] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Received unexpected event network-vif-plugged-383b824e-b71e-46ba-bbfa-c35640e95719 for instance with vm_state building and task_state spawning. [ 615.725975] env[67131]: DEBUG nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Received event network-changed-383b824e-b71e-46ba-bbfa-c35640e95719 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 615.726161] env[67131]: DEBUG nova.compute.manager [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Refreshing instance network info cache due to event network-changed-383b824e-b71e-46ba-bbfa-c35640e95719. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 615.726344] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Acquiring lock "refresh_cache-f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 615.726480] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Acquired lock "refresh_cache-f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 615.726636] env[67131]: DEBUG nova.network.neutron [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Refreshing network info cache for port 383b824e-b71e-46ba-bbfa-c35640e95719 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 615.854967] env[67131]: DEBUG oslo_vmware.api [-] 
Task: {'id': task-3456412, 'name': CreateVM_Task, 'duration_secs': 0.27868} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 615.855462] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 615.860696] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 615.860780] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 615.861109] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 615.861367] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6ee1e6d6-9739-410a-ba63-21d05c4ec95b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.867693] env[67131]: DEBUG oslo_vmware.api [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 
tempest-MigrationsAdminTest-1703286375-project-member] Waiting for the task: (returnval){ [ 615.867693] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5249cdd3-ed00-4d89-453a-3a6e127d2813" [ 615.867693] env[67131]: _type = "Task" [ 615.867693] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 615.878169] env[67131]: DEBUG oslo_vmware.api [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5249cdd3-ed00-4d89-453a-3a6e127d2813, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 616.067713] env[67131]: DEBUG nova.compute.manager [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Received event network-vif-plugged-1a66cd40-0c6b-463a-942a-b066de9752e5 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 616.067915] env[67131]: DEBUG oslo_concurrency.lockutils [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] Acquiring lock "fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 616.068134] env[67131]: DEBUG oslo_concurrency.lockutils [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] Lock "fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 616.068322] env[67131]: DEBUG 
oslo_concurrency.lockutils [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] Lock "fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 616.068484] env[67131]: DEBUG nova.compute.manager [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] No waiting events found dispatching network-vif-plugged-1a66cd40-0c6b-463a-942a-b066de9752e5 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 616.068716] env[67131]: WARNING nova.compute.manager [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Received unexpected event network-vif-plugged-1a66cd40-0c6b-463a-942a-b066de9752e5 for instance with vm_state building and task_state spawning. [ 616.068894] env[67131]: DEBUG nova.compute.manager [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Received event network-changed-1a66cd40-0c6b-463a-942a-b066de9752e5 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 616.069200] env[67131]: DEBUG nova.compute.manager [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Refreshing instance network info cache due to event network-changed-1a66cd40-0c6b-463a-942a-b066de9752e5. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 616.069462] env[67131]: DEBUG oslo_concurrency.lockutils [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] Acquiring lock "refresh_cache-fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 616.069609] env[67131]: DEBUG oslo_concurrency.lockutils [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] Acquired lock "refresh_cache-fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 616.069821] env[67131]: DEBUG nova.network.neutron [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Refreshing network info cache for port 1a66cd40-0c6b-463a-942a-b066de9752e5 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 616.281385] env[67131]: DEBUG nova.network.neutron [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Updated VIF entry in instance network info cache for port 383b824e-b71e-46ba-bbfa-c35640e95719. 
{{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 616.281724] env[67131]: DEBUG nova.network.neutron [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Updating instance_info_cache with network_info: [{"id": "383b824e-b71e-46ba-bbfa-c35640e95719", "address": "fa:16:3e:ee:b3:77", "network": {"id": "62196244-084d-44d2-8b00-7d0f4146752d", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1462018196-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32c37f40bfd44d399c200b288f892756", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bec1528b-3e87-477b-8ab2-02696ad47e66", "external-id": "nsx-vlan-transportzone-180", "segmentation_id": 180, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap383b824e-b7", "ovs_interfaceid": "383b824e-b71e-46ba-bbfa-c35640e95719", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 616.291836] env[67131]: DEBUG oslo_concurrency.lockutils [req-7145c723-6b6d-4bd1-bcdc-185c758abf66 req-f10ea8de-f32a-4e05-bbbd-837fc3e2442e service nova] Releasing lock "refresh_cache-f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 616.384787] env[67131]: DEBUG oslo_concurrency.lockutils 
[None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 616.385453] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 616.385952] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 616.730107] env[67131]: DEBUG nova.compute.manager [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Received event network-changed-fe2e0110-8252-4bb0-bf22-3534de8f8fe2 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 616.730107] env[67131]: DEBUG nova.compute.manager [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Refreshing instance network info cache due to event network-changed-fe2e0110-8252-4bb0-bf22-3534de8f8fe2. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 616.730107] env[67131]: DEBUG oslo_concurrency.lockutils [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] Acquiring lock "refresh_cache-b47e3b03-7b84-4305-a55c-577401e5acf3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 616.730107] env[67131]: DEBUG oslo_concurrency.lockutils [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] Acquired lock "refresh_cache-b47e3b03-7b84-4305-a55c-577401e5acf3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 616.730107] env[67131]: DEBUG nova.network.neutron [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Refreshing network info cache for port fe2e0110-8252-4bb0-bf22-3534de8f8fe2 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 616.874158] env[67131]: DEBUG nova.network.neutron [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Updated VIF entry in instance network info cache for port 1a66cd40-0c6b-463a-942a-b066de9752e5. 
{{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 616.876383] env[67131]: DEBUG nova.network.neutron [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Updating instance_info_cache with network_info: [{"id": "1a66cd40-0c6b-463a-942a-b066de9752e5", "address": "fa:16:3e:66:e1:f3", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.235", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1a66cd40-0c", "ovs_interfaceid": "1a66cd40-0c6b-463a-942a-b066de9752e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 616.892656] env[67131]: DEBUG oslo_concurrency.lockutils [req-5a7bced7-e53b-458e-8d1d-14bb60b34207 req-5885b718-7e59-4864-bf44-01dbb0be9c6c service nova] Releasing lock "refresh_cache-fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 617.601027] env[67131]: DEBUG nova.network.neutron [req-91124354-affb-428c-80b1-bbb6b5c30ab7 
req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Updated VIF entry in instance network info cache for port fe2e0110-8252-4bb0-bf22-3534de8f8fe2. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 617.601445] env[67131]: DEBUG nova.network.neutron [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Updating instance_info_cache with network_info: [{"id": "fe2e0110-8252-4bb0-bf22-3534de8f8fe2", "address": "fa:16:3e:9c:56:b5", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfe2e0110-82", "ovs_interfaceid": "fe2e0110-8252-4bb0-bf22-3534de8f8fe2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 617.619037] env[67131]: DEBUG oslo_concurrency.lockutils [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] Releasing lock "refresh_cache-b47e3b03-7b84-4305-a55c-577401e5acf3" 
{{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 617.619037] env[67131]: DEBUG nova.compute.manager [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Received event network-vif-plugged-e979a923-3f4d-4035-9075-90dcf514b341 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 617.619037] env[67131]: DEBUG oslo_concurrency.lockutils [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] Acquiring lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.619037] env[67131]: DEBUG oslo_concurrency.lockutils [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] Lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.619504] env[67131]: DEBUG oslo_concurrency.lockutils [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] Lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.619504] env[67131]: DEBUG nova.compute.manager [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] No waiting events found dispatching network-vif-plugged-e979a923-3f4d-4035-9075-90dcf514b341 {{(pid=67131) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 617.619504] env[67131]: WARNING nova.compute.manager [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Received unexpected event network-vif-plugged-e979a923-3f4d-4035-9075-90dcf514b341 for instance with vm_state building and task_state spawning. [ 617.619504] env[67131]: DEBUG nova.compute.manager [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Received event network-changed-e979a923-3f4d-4035-9075-90dcf514b341 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 617.619739] env[67131]: DEBUG nova.compute.manager [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Refreshing instance network info cache due to event network-changed-e979a923-3f4d-4035-9075-90dcf514b341. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 617.619739] env[67131]: DEBUG oslo_concurrency.lockutils [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] Acquiring lock "refresh_cache-39b67ef8-fce0-4bf3-b161-b5fbd588214b" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 617.619739] env[67131]: DEBUG oslo_concurrency.lockutils [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] Acquired lock "refresh_cache-39b67ef8-fce0-4bf3-b161-b5fbd588214b" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 617.619739] env[67131]: DEBUG nova.network.neutron [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Refreshing network info cache for port e979a923-3f4d-4035-9075-90dcf514b341 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 618.229758] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.230938] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.230938] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Starting heal instance info cache {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 618.230938] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None 
None] Rebuilding the list of instances to heal {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 618.253479] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.253637] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.255643] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.255748] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.255845] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.255986] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Skipping network cache update for instance because it is Building. 
{{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.256129] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.256254] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.256368] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.256484] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Didn't find any instances for network info cache update. 
{{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 618.257027] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 618.257329] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 618.257460] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 618.257705] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 618.257898] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 618.258024] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 618.258412] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67131) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 618.258412] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager.update_available_resource {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 618.270743] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.270977] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.271169] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 618.271340] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67131) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 618.272401] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc514fb4-b0d6-41cc-8764-b008a0801a49 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 618.286816] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b795d559-94c6-4ca9-8582-0b62ce15f0fb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 618.309019] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82f58f21-6fa5-4909-b1e5-0ddbd5f7fa05 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 618.309349] env[67131]: DEBUG nova.network.neutron [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Updated VIF entry in instance network info cache for port e979a923-3f4d-4035-9075-90dcf514b341. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 618.309659] env[67131]: DEBUG nova.network.neutron [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Updating instance_info_cache with network_info: [{"id": "e979a923-3f4d-4035-9075-90dcf514b341", "address": "fa:16:3e:10:04:6a", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.182", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape979a923-3f", "ovs_interfaceid": "e979a923-3f4d-4035-9075-90dcf514b341", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 618.315711] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bb01bb1-bf22-41f4-be6c-567fa4894aa4 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 618.353377] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180918MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67131) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 618.353529] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 618.353724] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 618.355544] env[67131]: DEBUG oslo_concurrency.lockutils [req-91124354-affb-428c-80b1-bbb6b5c30ab7 req-de020011-7172-4f7d-ae79-e346f6d1e08d service nova] Releasing lock "refresh_cache-39b67ef8-fce0-4bf3-b161-b5fbd588214b" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 618.430566] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance c9a491fe-aff4-4b4f-bcfb-dd56f1010576 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 618.430730] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance e55f1592-024d-431d-b3a9-63b27513cac4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 618.430860] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 618.430982] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 47856710-dd0d-4d4a-9af4-ae3db29510e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 618.431187] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 618.431270] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance cf2feb3e-6cf3-4db2-83fd-beb5e8387e81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 618.431391] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 618.431509] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance b47e3b03-7b84-4305-a55c-577401e5acf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 618.431628] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 39b67ef8-fce0-4bf3-b161-b5fbd588214b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 618.431824] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 618.431958] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 618.589128] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5791fc6d-a99f-43f5-bff6-4da26f384390 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 618.598017] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c66c2e6b-f002-49e0-9ce9-664a88391a37 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 618.639131] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59d37fd3-71c8-4895-bc29-3c2b728d0526 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 618.647467] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98fb5b4f-a751-4736-a3fd-400943861354 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 618.664920] env[67131]: DEBUG nova.compute.provider_tree [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 618.675374] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 618.704627] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67131) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 618.704825] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.351s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 624.855403] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquiring lock "28bf23c6-d36a-4822-9569-c825a7366ed4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 624.857874] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Lock "28bf23c6-d36a-4822-9569-c825a7366ed4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 624.871882] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 624.945240] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 624.945483] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 624.946973] env[67131]: INFO nova.compute.claims [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 625.201179] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41f92663-2297-4051-ba11-d6870c4584f8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.211024] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bacd6a7e-27e5-41ae-af4a-b33440ffe11e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.251500] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38f4835c-bc6b-4d15-99f6-edb27c6e06bb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.258970] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-020c60bf-e6a0-407f-b036-9efa3f266652 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.277350] env[67131]: DEBUG nova.compute.provider_tree [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 625.288944] env[67131]: DEBUG nova.scheduler.client.report [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 625.333203] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.388s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 625.337311] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 625.377509] env[67131]: DEBUG nova.compute.utils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 625.378882] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 625.379219] env[67131]: DEBUG nova.network.neutron [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 625.390515] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 625.481891] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Start spawning the instance on the hypervisor. {{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 625.513377] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 625.513748] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 625.514535] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 625.514535] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 625.514692] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 625.514785] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 625.515081] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 625.515287] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 625.515467] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 625.515639] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 625.515922] env[67131]: DEBUG nova.virt.hardware [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 625.516871] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af9aaf66-d1bb-4b85-b394-e66342851ab4 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.527032] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ff4e371-84c6-4a10-9d56-6ce2e7ecbe95 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.549956] env[67131]: DEBUG nova.policy [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1a023c907aa44f0b8c1daa2b1cfef0e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1e00dd5864e34a54aa651d365dc8f27e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}}
[ 626.167656] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquiring lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 626.168293] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 626.662700] env[67131]: DEBUG nova.network.neutron [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Successfully created port: 3b78f359-3d96-44cc-be0c-d87d96349be3 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 627.111999] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Acquiring lock "9ebfb760-6b10-4ea7-8276-f5e8b1ec6208" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 627.112268] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Lock "9ebfb760-6b10-4ea7-8276-f5e8b1ec6208" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 629.034272] env[67131]: DEBUG nova.network.neutron [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Successfully updated port: 3b78f359-3d96-44cc-be0c-d87d96349be3 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 629.051523] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquiring lock "refresh_cache-28bf23c6-d36a-4822-9569-c825a7366ed4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 629.051682] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquired lock "refresh_cache-28bf23c6-d36a-4822-9569-c825a7366ed4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 629.051853] env[67131]: DEBUG nova.network.neutron [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 629.201894] env[67131]: DEBUG nova.network.neutron [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 630.300229] env[67131]: DEBUG nova.network.neutron [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Updating instance_info_cache with network_info: [{"id": "3b78f359-3d96-44cc-be0c-d87d96349be3", "address": "fa:16:3e:df:44:80", "network": {"id": "1a89806a-ccf6-4085-80bb-2818321f07ae", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-464326195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e00dd5864e34a54aa651d365dc8f27e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1e029825-6c65-4ac7-88f6-65f9d106db76", "external-id": "nsx-vlan-transportzone-428", "segmentation_id": 428, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b78f359-3d", "ovs_interfaceid": "3b78f359-3d96-44cc-be0c-d87d96349be3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 630.319683] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Releasing lock "refresh_cache-28bf23c6-d36a-4822-9569-c825a7366ed4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 630.319683] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Instance network_info: |[{"id": "3b78f359-3d96-44cc-be0c-d87d96349be3", "address": "fa:16:3e:df:44:80", "network": {"id": "1a89806a-ccf6-4085-80bb-2818321f07ae", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-464326195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e00dd5864e34a54aa651d365dc8f27e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1e029825-6c65-4ac7-88f6-65f9d106db76", "external-id": "nsx-vlan-transportzone-428", "segmentation_id": 428, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b78f359-3d", "ovs_interfaceid": "3b78f359-3d96-44cc-be0c-d87d96349be3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 630.320579] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:df:44:80', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1e029825-6c65-4ac7-88f6-65f9d106db76', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3b78f359-3d96-44cc-be0c-d87d96349be3', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 630.330025] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Creating folder: Project (1e00dd5864e34a54aa651d365dc8f27e). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 630.330271] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-15df3ff3-28ec-4f21-b767-476fb7037d39 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 630.346135] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Created folder: Project (1e00dd5864e34a54aa651d365dc8f27e) in parent group-v690228.
[ 630.346135] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Creating folder: Instances. Parent ref: group-v690256. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 630.346135] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-94e7cb0c-f2b4-4c5d-a2a7-22155af4b93b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 630.355864] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Created folder: Instances in parent group-v690256.
[ 630.357388] env[67131]: DEBUG oslo.service.loopingcall [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 630.357388] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 630.357388] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cadc61f5-5439-4f14-a8e9-3958743cdb99 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 630.386888] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 630.386888] env[67131]: value = "task-3456415"
[ 630.386888] env[67131]: _type = "Task"
[ 630.386888] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 630.395150] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456415, 'name': CreateVM_Task} progress is 0%.
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 630.604839] env[67131]: DEBUG nova.compute.manager [req-1702dee8-847f-464a-b3a5-300563cab384 req-8eba73f2-6269-47cc-8d56-5e05fdca4a8e service nova] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Received event network-vif-plugged-3b78f359-3d96-44cc-be0c-d87d96349be3 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 630.605075] env[67131]: DEBUG oslo_concurrency.lockutils [req-1702dee8-847f-464a-b3a5-300563cab384 req-8eba73f2-6269-47cc-8d56-5e05fdca4a8e service nova] Acquiring lock "28bf23c6-d36a-4822-9569-c825a7366ed4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.605281] env[67131]: DEBUG oslo_concurrency.lockutils [req-1702dee8-847f-464a-b3a5-300563cab384 req-8eba73f2-6269-47cc-8d56-5e05fdca4a8e service nova] Lock "28bf23c6-d36a-4822-9569-c825a7366ed4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.605457] env[67131]: DEBUG oslo_concurrency.lockutils [req-1702dee8-847f-464a-b3a5-300563cab384 req-8eba73f2-6269-47cc-8d56-5e05fdca4a8e service nova] Lock "28bf23c6-d36a-4822-9569-c825a7366ed4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.605611] env[67131]: DEBUG nova.compute.manager [req-1702dee8-847f-464a-b3a5-300563cab384 req-8eba73f2-6269-47cc-8d56-5e05fdca4a8e service nova] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] No waiting events found dispatching network-vif-plugged-3b78f359-3d96-44cc-be0c-d87d96349be3 {{(pid=67131) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 630.605766] env[67131]: WARNING nova.compute.manager [req-1702dee8-847f-464a-b3a5-300563cab384 req-8eba73f2-6269-47cc-8d56-5e05fdca4a8e service nova] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Received unexpected event network-vif-plugged-3b78f359-3d96-44cc-be0c-d87d96349be3 for instance with vm_state building and task_state spawning. [ 630.899211] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456415, 'name': CreateVM_Task, 'duration_secs': 0.30664} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 630.899459] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 630.900242] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 630.900403] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 630.900953] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 630.901219] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ecb0c843-6f6a-4548-9a30-8e1a09a9895b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.906350] env[67131]: DEBUG oslo_vmware.api [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Waiting for the task: (returnval){ [ 630.906350] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52e1b9c5-d803-6dcd-b822-06a8a6c03b38" [ 630.906350] env[67131]: _type = "Task" [ 630.906350] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 630.916357] env[67131]: DEBUG oslo_vmware.api [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52e1b9c5-d803-6dcd-b822-06a8a6c03b38, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 631.419438] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 631.419438] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 631.419438] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 636.511280] env[67131]: DEBUG nova.compute.manager [req-5466cfc6-be48-4ea3-a100-2ebbf57211cc req-c95875fb-ef12-4a85-a180-1c4b25b67cfa service nova] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Received event network-changed-3b78f359-3d96-44cc-be0c-d87d96349be3 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 636.511280] env[67131]: DEBUG nova.compute.manager [req-5466cfc6-be48-4ea3-a100-2ebbf57211cc req-c95875fb-ef12-4a85-a180-1c4b25b67cfa service nova] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Refreshing instance network info cache due to event 
network-changed-3b78f359-3d96-44cc-be0c-d87d96349be3. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 636.511877] env[67131]: DEBUG oslo_concurrency.lockutils [req-5466cfc6-be48-4ea3-a100-2ebbf57211cc req-c95875fb-ef12-4a85-a180-1c4b25b67cfa service nova] Acquiring lock "refresh_cache-28bf23c6-d36a-4822-9569-c825a7366ed4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 636.511877] env[67131]: DEBUG oslo_concurrency.lockutils [req-5466cfc6-be48-4ea3-a100-2ebbf57211cc req-c95875fb-ef12-4a85-a180-1c4b25b67cfa service nova] Acquired lock "refresh_cache-28bf23c6-d36a-4822-9569-c825a7366ed4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 636.511877] env[67131]: DEBUG nova.network.neutron [req-5466cfc6-be48-4ea3-a100-2ebbf57211cc req-c95875fb-ef12-4a85-a180-1c4b25b67cfa service nova] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Refreshing network info cache for port 3b78f359-3d96-44cc-be0c-d87d96349be3 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 638.125622] env[67131]: DEBUG nova.network.neutron [req-5466cfc6-be48-4ea3-a100-2ebbf57211cc req-c95875fb-ef12-4a85-a180-1c4b25b67cfa service nova] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Updated VIF entry in instance network info cache for port 3b78f359-3d96-44cc-be0c-d87d96349be3. 
{{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 638.125959] env[67131]: DEBUG nova.network.neutron [req-5466cfc6-be48-4ea3-a100-2ebbf57211cc req-c95875fb-ef12-4a85-a180-1c4b25b67cfa service nova] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Updating instance_info_cache with network_info: [{"id": "3b78f359-3d96-44cc-be0c-d87d96349be3", "address": "fa:16:3e:df:44:80", "network": {"id": "1a89806a-ccf6-4085-80bb-2818321f07ae", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-464326195-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1e00dd5864e34a54aa651d365dc8f27e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1e029825-6c65-4ac7-88f6-65f9d106db76", "external-id": "nsx-vlan-transportzone-428", "segmentation_id": 428, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b78f359-3d", "ovs_interfaceid": "3b78f359-3d96-44cc-be0c-d87d96349be3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.140850] env[67131]: DEBUG oslo_concurrency.lockutils [req-5466cfc6-be48-4ea3-a100-2ebbf57211cc req-c95875fb-ef12-4a85-a180-1c4b25b67cfa service nova] Releasing lock "refresh_cache-28bf23c6-d36a-4822-9569-c825a7366ed4" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 654.660174] env[67131]: WARNING oslo_vmware.rw_handles [None 
req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles response.begin() [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 654.660174] env[67131]: ERROR oslo_vmware.rw_handles [ 654.661021] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/45b2c351-20da-457e-aafe-286242463b62/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 654.662044] env[67131]: DEBUG 
nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 654.662285] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Copying Virtual Disk [datastore1] vmware_temp/45b2c351-20da-457e-aafe-286242463b62/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] vmware_temp/45b2c351-20da-457e-aafe-286242463b62/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 654.662569] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-63b0a5a4-55a8-4d34-86b4-f0b647464c0e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.672374] env[67131]: DEBUG oslo_vmware.api [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Waiting for the task: (returnval){ [ 654.672374] env[67131]: value = "task-3456416" [ 654.672374] env[67131]: _type = "Task" [ 654.672374] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 654.683325] env[67131]: DEBUG oslo_vmware.api [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Task: {'id': task-3456416, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 655.183936] env[67131]: DEBUG oslo_vmware.exceptions [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Fault InvalidArgument not matched. {{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 655.184236] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 655.187970] env[67131]: ERROR nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 655.187970] env[67131]: Faults: ['InvalidArgument'] [ 655.187970] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Traceback (most recent call last): [ 655.187970] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 655.187970] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] yield resources [ 655.187970] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 655.187970] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] 
self.driver.spawn(context, instance, image_meta, [ 655.187970] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 655.187970] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 655.187970] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 655.187970] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] self._fetch_image_if_missing(context, vi) [ 655.187970] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] image_cache(vi, tmp_image_ds_loc) [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] vm_util.copy_virtual_disk( [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] session._wait_for_task(vmdk_copy_task) [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] return 
self.wait_for_task(task_ref) [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] return evt.wait() [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] result = hub.switch() [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 655.188447] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] return self.greenlet.switch() [ 655.188827] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 655.188827] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] self.f(*self.args, **self.kw) [ 655.188827] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 655.188827] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] raise exceptions.translate_fault(task_info.error) [ 655.188827] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 655.188827] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Faults: ['InvalidArgument'] [ 655.188827] 
env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] [ 655.188827] env[67131]: INFO nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Terminating instance [ 655.191193] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 655.191436] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 655.192167] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquiring lock "refresh_cache-d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 655.192385] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquired lock "refresh_cache-d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 655.192627] env[67131]: DEBUG nova.network.neutron [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 
tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 655.193709] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-afe89eba-0c13-409f-8a40-a7015c3afb03 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.203870] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 655.203870] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 655.208492] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ef65af08-a3fe-4f59-bf13-11d9208a70bc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.215810] env[67131]: DEBUG oslo_vmware.api [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Waiting for the task: (returnval){ [ 655.215810] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]521a1f75-8927-e7c0-4ae5-079033a98054" [ 655.215810] env[67131]: _type = "Task" [ 655.215810] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 655.227568] env[67131]: DEBUG oslo_vmware.api [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]521a1f75-8927-e7c0-4ae5-079033a98054, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 655.279575] env[67131]: DEBUG nova.network.neutron [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 655.730088] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 655.730391] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Creating directory with path [datastore1] vmware_temp/a2b23204-20eb-4c72-80d8-70368864594e/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 655.730608] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-400b4d0c-048f-4651-8f64-77e3922c791f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.753893] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None 
req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Created directory with path [datastore1] vmware_temp/a2b23204-20eb-4c72-80d8-70368864594e/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 655.755406] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Fetch image to [datastore1] vmware_temp/a2b23204-20eb-4c72-80d8-70368864594e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 655.755406] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/a2b23204-20eb-4c72-80d8-70368864594e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 655.755406] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a11ec3b8-355e-47fc-b9bf-35c91c6aa44f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.764578] env[67131]: DEBUG nova.network.neutron [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 655.766134] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d6461dc-65c1-46f7-b6ef-38d18d524c55 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.778825] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5422127b-53b8-4876-a2a9-6be801558d6c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.785068] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Releasing lock "refresh_cache-d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 655.785706] env[67131]: DEBUG nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 655.786019] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 655.787138] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef812568-c6bb-4471-8cdc-81a63ceec4be {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.822949] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-261caaba-ef14-4f34-a035-a483b89548ab {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.826088] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 655.826430] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ca124704-1716-4bec-b51f-e028b392d8a0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.832089] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6874bacb-3ed1-483b-aba6-e043b54376dd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.860765] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 
tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 655.861074] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 655.861181] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Deleting the datastore file [datastore1] d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 655.861474] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a42ae07d-2278-4034-b97b-e12eedcdac4b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.865948] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 655.873471] env[67131]: DEBUG oslo_vmware.api [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Waiting for the task: (returnval){ [ 655.873471] env[67131]: value = "task-3456418" [ 655.873471] env[67131]: _type = "Task" [ 655.873471] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 655.882186] env[67131]: DEBUG oslo_vmware.api [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Task: {'id': task-3456418, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 655.930111] env[67131]: DEBUG oslo_vmware.rw_handles [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a2b23204-20eb-4c72-80d8-70368864594e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 655.994503] env[67131]: DEBUG oslo_vmware.rw_handles [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 655.994787] env[67131]: DEBUG oslo_vmware.rw_handles [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a2b23204-20eb-4c72-80d8-70368864594e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 656.384459] env[67131]: DEBUG oslo_vmware.api [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Task: {'id': task-3456418, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.044208} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 656.384853] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 656.385180] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 656.385456] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 656.385963] env[67131]: INFO nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 656.386330] env[67131]: DEBUG oslo.service.loopingcall [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 656.386740] env[67131]: DEBUG nova.compute.manager [-] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Skipping network deallocation for instance since networking was not requested. {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 656.389886] env[67131]: DEBUG nova.compute.claims [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 656.389886] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.390303] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.647312] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bea13128-e8d3-4e47-83e6-11d0c5ef5930 
{{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.656294] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-527cc092-302f-40cb-bb59-c8fdc3d6a243 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.694523] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49eceeb9-abdc-4546-9cef-c0979ef0a4ee {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.701648] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d481047-1316-40fe-a5fd-e848dd547009 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.717256] env[67131]: DEBUG nova.compute.provider_tree [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 656.727268] env[67131]: DEBUG nova.scheduler.client.report [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 656.745243] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.355s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.745689] env[67131]: ERROR nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 656.745689] env[67131]: Faults: ['InvalidArgument'] [ 656.745689] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Traceback (most recent call last): [ 656.745689] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 656.745689] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] self.driver.spawn(context, instance, image_meta, [ 656.745689] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 656.745689] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 656.745689] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 656.745689] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] 
self._fetch_image_if_missing(context, vi) [ 656.745689] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 656.745689] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] image_cache(vi, tmp_image_ds_loc) [ 656.745689] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] vm_util.copy_virtual_disk( [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] session._wait_for_task(vmdk_copy_task) [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] return self.wait_for_task(task_ref) [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] return evt.wait() [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] result = hub.switch() [ 656.746176] env[67131]: ERROR 
nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] return self.greenlet.switch() [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 656.746176] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] self.f(*self.args, **self.kw) [ 656.746659] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 656.746659] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] raise exceptions.translate_fault(task_info.error) [ 656.746659] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 656.746659] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Faults: ['InvalidArgument'] [ 656.746659] env[67131]: ERROR nova.compute.manager [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] [ 656.746659] env[67131]: DEBUG nova.compute.utils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] VimFaultException {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 656.750529] env[67131]: DEBUG nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Build of 
instance d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf was re-scheduled: A specified parameter was not correct: fileType [ 656.750529] env[67131]: Faults: ['InvalidArgument'] {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 656.750605] env[67131]: DEBUG nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 656.750881] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquiring lock "refresh_cache-d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 656.750927] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Acquired lock "refresh_cache-d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 656.751099] env[67131]: DEBUG nova.network.neutron [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 656.834282] env[67131]: DEBUG nova.network.neutron [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 657.138706] env[67131]: DEBUG nova.network.neutron [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 657.152642] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Releasing lock "refresh_cache-d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 657.152879] env[67131]: DEBUG nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 657.153106] env[67131]: DEBUG nova.compute.manager [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] [instance: d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 657.269168] env[67131]: INFO nova.scheduler.client.report [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Deleted allocations for instance d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf [ 657.294241] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a6c7fc0-928c-4591-90aa-cb0da8c9a6c4 tempest-ServersAdmin275Test-1632169565 tempest-ServersAdmin275Test-1632169565-project-member] Lock "d1f8a4de-eb82-404a-a0e3-7a7d4f6730bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 52.876s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.330395] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 657.397087] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.397459] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.398964] env[67131]: INFO nova.compute.claims [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 657.634459] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-268cf368-4eb0-4610-a38a-87196eb6300b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.647335] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8387c0b8-6bc8-4816-b901-f5497de9a3fd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.681680] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-388d3b28-b592-4b72-afd1-b0d3e3fca5bc {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.690540] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e57d947f-18f5-411e-ab9f-3892a95b5246 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.703998] env[67131]: DEBUG nova.compute.provider_tree [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 657.715208] env[67131]: DEBUG nova.scheduler.client.report [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 657.738728] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.341s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.738945] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 
tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 657.779057] env[67131]: DEBUG nova.compute.utils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 657.780578] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 657.780748] env[67131]: DEBUG nova.network.neutron [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 657.795043] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Start building block device mappings for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 657.870047] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Start spawning the instance on the hypervisor. {{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 657.891564] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 657.891808] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 657.891959] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 
tempest-ServersTestFqdnHostnames-618156096-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 657.892149] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 657.892288] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 657.892433] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 657.892642] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 657.892990] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 657.892990] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 
tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 657.893217] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 657.893588] env[67131]: DEBUG nova.virt.hardware [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 657.894572] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a8ef142-dfd1-4d2f-837c-caf1aa08015e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.903764] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1b8e7f5-6895-41ed-b4b3-29490dfbf673 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.045830] env[67131]: DEBUG nova.policy [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0b0d60fd2e1c4ad2ba8bbd600b673698', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '579ede2e6cd14ded9458ea590d7cc525', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 
'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 660.317976] env[67131]: DEBUG nova.network.neutron [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Successfully created port: aa3247c0-75cd-4c33-9c4a-6ac647993366 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 663.978968] env[67131]: DEBUG nova.network.neutron [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Successfully updated port: aa3247c0-75cd-4c33-9c4a-6ac647993366 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 663.989580] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquiring lock "refresh_cache-7e46e878-1564-4f3b-baa5-5c99d7e04d80" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 663.989738] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquired lock "refresh_cache-7e46e878-1564-4f3b-baa5-5c99d7e04d80" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 663.989878] env[67131]: DEBUG nova.network.neutron [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Building network info 
cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 664.163290] env[67131]: DEBUG nova.network.neutron [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 665.284030] env[67131]: DEBUG nova.network.neutron [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Updating instance_info_cache with network_info: [{"id": "aa3247c0-75cd-4c33-9c4a-6ac647993366", "address": "fa:16:3e:d0:fe:ab", "network": {"id": "2a247441-61a8-40e5-8182-0d7d9b90d1e2", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-411264325-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "579ede2e6cd14ded9458ea590d7cc525", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "01fe2e08-46f6-4cee-aefd-934461f8077d", "external-id": "nsx-vlan-transportzone-806", "segmentation_id": 806, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa3247c0-75", "ovs_interfaceid": "aa3247c0-75cd-4c33-9c4a-6ac647993366", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 665.300887] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Releasing lock "refresh_cache-7e46e878-1564-4f3b-baa5-5c99d7e04d80" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 665.300887] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Instance network_info: |[{"id": "aa3247c0-75cd-4c33-9c4a-6ac647993366", "address": "fa:16:3e:d0:fe:ab", "network": {"id": "2a247441-61a8-40e5-8182-0d7d9b90d1e2", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-411264325-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "579ede2e6cd14ded9458ea590d7cc525", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "01fe2e08-46f6-4cee-aefd-934461f8077d", "external-id": "nsx-vlan-transportzone-806", "segmentation_id": 806, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa3247c0-75", "ovs_interfaceid": "aa3247c0-75cd-4c33-9c4a-6ac647993366", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 
665.301110] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d0:fe:ab', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '01fe2e08-46f6-4cee-aefd-934461f8077d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'aa3247c0-75cd-4c33-9c4a-6ac647993366', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 665.310918] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Creating folder: Project (579ede2e6cd14ded9458ea590d7cc525). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 665.311685] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ea455543-f752-4c01-8261-1756eb37cd2e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.325019] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Created folder: Project (579ede2e6cd14ded9458ea590d7cc525) in parent group-v690228. [ 665.325019] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Creating folder: Instances. Parent ref: group-v690259. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 665.325019] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d857a83-0726-4510-84a6-06f5d813af3a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.333721] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Created folder: Instances in parent group-v690259. [ 665.336678] env[67131]: DEBUG oslo.service.loopingcall [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 665.336678] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 665.336678] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b928dd97-5210-4561-961a-2fe7970cbe29 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.359330] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 665.359330] env[67131]: value = "task-3456421" [ 665.359330] env[67131]: _type = "Task" [ 665.359330] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 665.367231] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456421, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 665.875860] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456421, 'name': CreateVM_Task, 'duration_secs': 0.296412} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 665.876031] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 665.876703] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 665.876865] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 665.877255] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 665.877618] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7e24277c-c979-4226-8fda-0f2c644e3c83 {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.882347] env[67131]: DEBUG oslo_vmware.api [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Waiting for the task: (returnval){ [ 665.882347] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52d58198-f022-d436-5119-bcef85375dac" [ 665.882347] env[67131]: _type = "Task" [ 665.882347] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 665.892123] env[67131]: DEBUG oslo_vmware.api [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52d58198-f022-d436-5119-bcef85375dac, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 666.393461] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 666.393461] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 666.393461] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 
tempest-ServersTestFqdnHostnames-618156096-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 666.742461] env[67131]: DEBUG nova.compute.manager [req-8433e1f4-ca98-440b-b8c2-3c25ecd61650 req-71c06370-2c47-490f-86d9-6dbe7b79cd78 service nova] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Received event network-vif-plugged-aa3247c0-75cd-4c33-9c4a-6ac647993366 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 666.742795] env[67131]: DEBUG oslo_concurrency.lockutils [req-8433e1f4-ca98-440b-b8c2-3c25ecd61650 req-71c06370-2c47-490f-86d9-6dbe7b79cd78 service nova] Acquiring lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.743086] env[67131]: DEBUG oslo_concurrency.lockutils [req-8433e1f4-ca98-440b-b8c2-3c25ecd61650 req-71c06370-2c47-490f-86d9-6dbe7b79cd78 service nova] Lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 666.743342] env[67131]: DEBUG oslo_concurrency.lockutils [req-8433e1f4-ca98-440b-b8c2-3c25ecd61650 req-71c06370-2c47-490f-86d9-6dbe7b79cd78 service nova] Lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 666.743593] env[67131]: DEBUG nova.compute.manager [req-8433e1f4-ca98-440b-b8c2-3c25ecd61650 req-71c06370-2c47-490f-86d9-6dbe7b79cd78 service nova] [instance: 
7e46e878-1564-4f3b-baa5-5c99d7e04d80] No waiting events found dispatching network-vif-plugged-aa3247c0-75cd-4c33-9c4a-6ac647993366 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 666.743838] env[67131]: WARNING nova.compute.manager [req-8433e1f4-ca98-440b-b8c2-3c25ecd61650 req-71c06370-2c47-490f-86d9-6dbe7b79cd78 service nova] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Received unexpected event network-vif-plugged-aa3247c0-75cd-4c33-9c4a-6ac647993366 for instance with vm_state building and task_state spawning. [ 671.845826] env[67131]: DEBUG nova.compute.manager [req-09dece58-1870-4414-986e-334578891eed req-490bbfbc-98f5-4f3a-93a7-d0154c34cbc4 service nova] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Received event network-changed-aa3247c0-75cd-4c33-9c4a-6ac647993366 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 671.846133] env[67131]: DEBUG nova.compute.manager [req-09dece58-1870-4414-986e-334578891eed req-490bbfbc-98f5-4f3a-93a7-d0154c34cbc4 service nova] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Refreshing instance network info cache due to event network-changed-aa3247c0-75cd-4c33-9c4a-6ac647993366. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 671.846387] env[67131]: DEBUG oslo_concurrency.lockutils [req-09dece58-1870-4414-986e-334578891eed req-490bbfbc-98f5-4f3a-93a7-d0154c34cbc4 service nova] Acquiring lock "refresh_cache-7e46e878-1564-4f3b-baa5-5c99d7e04d80" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 671.846387] env[67131]: DEBUG oslo_concurrency.lockutils [req-09dece58-1870-4414-986e-334578891eed req-490bbfbc-98f5-4f3a-93a7-d0154c34cbc4 service nova] Acquired lock "refresh_cache-7e46e878-1564-4f3b-baa5-5c99d7e04d80" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 671.846551] env[67131]: DEBUG nova.network.neutron [req-09dece58-1870-4414-986e-334578891eed req-490bbfbc-98f5-4f3a-93a7-d0154c34cbc4 service nova] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Refreshing network info cache for port aa3247c0-75cd-4c33-9c4a-6ac647993366 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 673.848540] env[67131]: DEBUG nova.network.neutron [req-09dece58-1870-4414-986e-334578891eed req-490bbfbc-98f5-4f3a-93a7-d0154c34cbc4 service nova] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Updated VIF entry in instance network info cache for port aa3247c0-75cd-4c33-9c4a-6ac647993366. 
{{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 673.848540] env[67131]: DEBUG nova.network.neutron [req-09dece58-1870-4414-986e-334578891eed req-490bbfbc-98f5-4f3a-93a7-d0154c34cbc4 service nova] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Updating instance_info_cache with network_info: [{"id": "aa3247c0-75cd-4c33-9c4a-6ac647993366", "address": "fa:16:3e:d0:fe:ab", "network": {"id": "2a247441-61a8-40e5-8182-0d7d9b90d1e2", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-411264325-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "579ede2e6cd14ded9458ea590d7cc525", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "01fe2e08-46f6-4cee-aefd-934461f8077d", "external-id": "nsx-vlan-transportzone-806", "segmentation_id": 806, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa3247c0-75", "ovs_interfaceid": "aa3247c0-75cd-4c33-9c4a-6ac647993366", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 673.860689] env[67131]: DEBUG oslo_concurrency.lockutils [req-09dece58-1870-4414-986e-334578891eed req-490bbfbc-98f5-4f3a-93a7-d0154c34cbc4 service nova] Releasing lock "refresh_cache-7e46e878-1564-4f3b-baa5-5c99d7e04d80" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 678.687064] env[67131]: DEBUG oslo_service.periodic_task [None 
req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.687064] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.710228] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.710228] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Starting heal instance info cache {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 678.711603] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Rebuilding the list of instances to heal {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 678.746131] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 678.746224] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Skipping network cache update for instance because it is Building. 
{{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 678.746330] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 678.746455] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 678.746579] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 678.746925] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 678.746925] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 678.746925] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Skipping network cache update for instance because it is Building. 
{{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 678.747160] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 678.747295] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 678.747417] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Didn't find any instances for network info cache update. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 678.747887] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.748237] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.748237] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67131) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 679.218068] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.218349] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager.update_available_resource {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.232333] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 679.232562] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 679.232730] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 679.233071] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67131) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 
679.234015] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94ac6503-d021-4dd1-9f9c-c6d22e880d67 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.244067] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-339e59b1-e763-4163-aaad-a4fe1ae81aaa {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.262538] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c320501d-9d32-465f-b1b3-d41f2e2c7db8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.269944] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef11d204-d9a2-4e6a-a0ec-3070fbf144e7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.299552] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180911MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67131) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 679.299717] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 679.299919] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s 
{{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 679.457812] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance c9a491fe-aff4-4b4f-bcfb-dd56f1010576 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 679.457812] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance e55f1592-024d-431d-b3a9-63b27513cac4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 679.457812] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 47856710-dd0d-4d4a-9af4-ae3db29510e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 679.457812] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 679.458033] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance cf2feb3e-6cf3-4db2-83fd-beb5e8387e81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 679.458033] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 679.458033] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance b47e3b03-7b84-4305-a55c-577401e5acf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 679.458033] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 39b67ef8-fce0-4bf3-b161-b5fbd588214b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 679.458158] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 28bf23c6-d36a-4822-9569-c825a7366ed4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 679.458158] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 7e46e878-1564-4f3b-baa5-5c99d7e04d80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 679.531422] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 679.531657] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 679.532387] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 679.737441] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-944d3b26-65c7-4f3b-8d82-cd94efbbf04d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.748220] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38e53645-1583-4a44-9ffb-28b50b6c07b6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.778958] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed7d0330-24ed-4dc7-a30c-73d3977fa99d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.786325] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0ec2cae-70f7-4075-be4b-3e0fb5bf4d10 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.801160] env[67131]: DEBUG nova.compute.provider_tree [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed in ProviderTree for provider: 
d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 679.811626] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 679.826190] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67131) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 679.826190] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.526s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 680.823229] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 680.823486] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67131) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 680.823486] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 688.658062] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Acquiring lock "aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.658334] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Lock "aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.243259] env[67131]: DEBUG oslo_concurrency.lockutils [None req-784e84c1-62c4-4393-bec3-b559c7d0a06f tempest-ServersTestJSON-1541738038 tempest-ServersTestJSON-1541738038-project-member] Acquiring lock "a4c08ab4-0633-415b-9e2a-6d9a3857c4cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.243726] env[67131]: DEBUG oslo_concurrency.lockutils [None req-784e84c1-62c4-4393-bec3-b559c7d0a06f tempest-ServersTestJSON-1541738038 tempest-ServersTestJSON-1541738038-project-member] Lock "a4c08ab4-0633-415b-9e2a-6d9a3857c4cd" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.671990] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Acquiring lock "c8c85f1c-6876-4632-a2d6-a835912d3285" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.672355] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Lock "c8c85f1c-6876-4632-a2d6-a835912d3285" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.528958] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Acquiring lock "4ce5668d-b588-4b92-bcc9-11d03eff2a84" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.529241] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Lock "4ce5668d-b588-4b92-bcc9-11d03eff2a84" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.327920] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Acquiring lock "3b2e1650-ee7f-46a2-94db-1a611384be03" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.328651] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Lock "3b2e1650-ee7f-46a2-94db-1a611384be03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.102455] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a5dfae5-6b8c-4294-8af0-4af23a7ddbe8 tempest-AttachInterfacesTestJSON-2011988282 tempest-AttachInterfacesTestJSON-2011988282-project-member] Acquiring lock "ad0b7a66-fcec-4aa9-8a86-aaa24ad9698c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.102687] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a5dfae5-6b8c-4294-8af0-4af23a7ddbe8 tempest-AttachInterfacesTestJSON-2011988282 tempest-AttachInterfacesTestJSON-2011988282-project-member] Lock "ad0b7a66-fcec-4aa9-8a86-aaa24ad9698c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.239396] env[67131]: DEBUG oslo_concurrency.lockutils 
[None req-b8b17b98-2599-47e2-a17a-4d60b612169a tempest-FloatingIPsAssociationNegativeTestJSON-324741952 tempest-FloatingIPsAssociationNegativeTestJSON-324741952-project-member] Acquiring lock "8275c0cc-71d8-4e9c-a324-3955fe1a9943" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.239698] env[67131]: DEBUG oslo_concurrency.lockutils [None req-b8b17b98-2599-47e2-a17a-4d60b612169a tempest-FloatingIPsAssociationNegativeTestJSON-324741952 tempest-FloatingIPsAssociationNegativeTestJSON-324741952-project-member] Lock "8275c0cc-71d8-4e9c-a324-3955fe1a9943" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.937136] env[67131]: DEBUG oslo_concurrency.lockutils [None req-16e75bcb-ec0c-4d58-a17c-cf2065d7f4eb tempest-AttachVolumeNegativeTest-1306521917 tempest-AttachVolumeNegativeTest-1306521917-project-member] Acquiring lock "691eb0c7-b6f0-45ff-92fb-1e47d38587f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.937655] env[67131]: DEBUG oslo_concurrency.lockutils [None req-16e75bcb-ec0c-4d58-a17c-cf2065d7f4eb tempest-AttachVolumeNegativeTest-1306521917 tempest-AttachVolumeNegativeTest-1306521917-project-member] Lock "691eb0c7-b6f0-45ff-92fb-1e47d38587f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.412819] env[67131]: DEBUG oslo_concurrency.lockutils [None req-42c2489f-75b5-4752-9a2f-cd4c35a7dc3f 
tempest-ServerGroupTestJSON-306088605 tempest-ServerGroupTestJSON-306088605-project-member] Acquiring lock "14293002-9e0b-4e4c-b4c5-9c726995dde0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.413502] env[67131]: DEBUG oslo_concurrency.lockutils [None req-42c2489f-75b5-4752-9a2f-cd4c35a7dc3f tempest-ServerGroupTestJSON-306088605 tempest-ServerGroupTestJSON-306088605-project-member] Lock "14293002-9e0b-4e4c-b4c5-9c726995dde0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.895727] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba335090-d472-4f0e-92f7-65ea60eedd52 tempest-ServersTestJSON-1541738038 tempest-ServersTestJSON-1541738038-project-member] Acquiring lock "61b77ab6-94d4-4a69-a2f5-b472215c46e7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.896714] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba335090-d472-4f0e-92f7-65ea60eedd52 tempest-ServersTestJSON-1541738038 tempest-ServersTestJSON-1541738038-project-member] Lock "61b77ab6-94d4-4a69-a2f5-b472215c46e7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.706482] env[67131]: DEBUG oslo_concurrency.lockutils [None req-fea2e719-11ff-4518-a3c6-b705cf9c6ed1 tempest-ServersNegativeTestJSON-641837042 tempest-ServersNegativeTestJSON-641837042-project-member] Acquiring lock "fd9b8dbe-0dc9-4072-93b3-6a06fdd1cd25" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.706745] env[67131]: DEBUG oslo_concurrency.lockutils [None req-fea2e719-11ff-4518-a3c6-b705cf9c6ed1 tempest-ServersNegativeTestJSON-641837042 tempest-ServersNegativeTestJSON-641837042-project-member] Lock "fd9b8dbe-0dc9-4072-93b3-6a06fdd1cd25" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.948799] env[67131]: DEBUG oslo_concurrency.lockutils [None req-6ae82b6f-0fb4-4404-82c0-39de16f44410 tempest-ServersTestBootFromVolume-1447370321 tempest-ServersTestBootFromVolume-1447370321-project-member] Acquiring lock "d3ea7ca6-6832-4e5c-ae89-40eb286c8bd9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.949061] env[67131]: DEBUG oslo_concurrency.lockutils [None req-6ae82b6f-0fb4-4404-82c0-39de16f44410 tempest-ServersTestBootFromVolume-1447370321 tempest-ServersTestBootFromVolume-1447370321-project-member] Lock "d3ea7ca6-6832-4e5c-ae89-40eb286c8bd9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.077675] env[67131]: WARNING oslo_vmware.rw_handles [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles Traceback 
(most recent call last): [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles response.begin() [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 705.077675] env[67131]: ERROR oslo_vmware.rw_handles [ 705.081167] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/a2b23204-20eb-4c72-80d8-70368864594e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 705.081167] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 705.081167] 
env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Copying Virtual Disk [datastore1] vmware_temp/a2b23204-20eb-4c72-80d8-70368864594e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] vmware_temp/a2b23204-20eb-4c72-80d8-70368864594e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 705.081167] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c7f312ce-fbb3-4039-9fea-422568f96d0a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.090052] env[67131]: DEBUG oslo_vmware.api [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Waiting for the task: (returnval){ [ 705.090052] env[67131]: value = "task-3456437" [ 705.090052] env[67131]: _type = "Task" [ 705.090052] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 705.099730] env[67131]: DEBUG oslo_vmware.api [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Task: {'id': task-3456437, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 705.603897] env[67131]: DEBUG oslo_vmware.exceptions [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Fault InvalidArgument not matched. 
{{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 705.603897] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 705.603897] env[67131]: ERROR nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 705.603897] env[67131]: Faults: ['InvalidArgument'] [ 705.603897] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Traceback (most recent call last): [ 705.603897] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 705.603897] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] yield resources [ 705.603897] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 705.603897] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] self.driver.spawn(context, instance, image_meta, [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: 
e55f1592-024d-431d-b3a9-63b27513cac4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] self._fetch_image_if_missing(context, vi) [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] image_cache(vi, tmp_image_ds_loc) [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] vm_util.copy_virtual_disk( [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] session._wait_for_task(vmdk_copy_task) [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] return self.wait_for_task(task_ref) [ 705.604150] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: 
e55f1592-024d-431d-b3a9-63b27513cac4] return evt.wait() [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] result = hub.switch() [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] return self.greenlet.switch() [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] self.f(*self.args, **self.kw) [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] raise exceptions.translate_fault(task_info.error) [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Faults: ['InvalidArgument'] [ 705.604542] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] [ 705.604542] env[67131]: INFO nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] 
[instance: e55f1592-024d-431d-b3a9-63b27513cac4] Terminating instance [ 705.606839] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 705.606944] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 705.607813] env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 705.608259] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 705.608530] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-444f0d1b-aa27-423d-96b3-2b5da9c10434 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.611327] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3a3777c-7d2e-436e-ae1f-37491475c440 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.618739] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 705.619017] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-36910daa-6b92-46d0-9ead-0130ec2744b3 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.621596] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 705.621807] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None 
req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 705.622863] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-279dbb18-850a-43bd-ac08-aac298cb5816 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.627795] env[67131]: DEBUG oslo_vmware.api [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Waiting for the task: (returnval){ [ 705.627795] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]527e40c4-c169-4a6a-397b-5d31e1d820a5" [ 705.627795] env[67131]: _type = "Task" [ 705.627795] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 705.635240] env[67131]: DEBUG oslo_vmware.api [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]527e40c4-c169-4a6a-397b-5d31e1d820a5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 705.679786] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 705.680017] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 705.680212] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Deleting the datastore file [datastore1] e55f1592-024d-431d-b3a9-63b27513cac4 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 705.680488] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-72f9ab95-7321-4610-83ee-be76a3186845 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.686946] env[67131]: DEBUG oslo_vmware.api [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Waiting for the task: (returnval){ [ 705.686946] env[67131]: value = "task-3456439" [ 705.686946] env[67131]: _type = "Task" [ 705.686946] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 705.695895] env[67131]: DEBUG oslo_vmware.api [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Task: {'id': task-3456439, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 706.139592] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 706.141336] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Creating directory with path [datastore1] vmware_temp/75d6314c-88cd-4743-83ee-9f07b66846fd/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 706.141336] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-99b85fe3-7010-4b6b-a938-0c8890ab2198 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.155108] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Created directory with path [datastore1] vmware_temp/75d6314c-88cd-4743-83ee-9f07b66846fd/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 706.157157] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None 
req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Fetch image to [datastore1] vmware_temp/75d6314c-88cd-4743-83ee-9f07b66846fd/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 706.157400] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/75d6314c-88cd-4743-83ee-9f07b66846fd/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 706.158214] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41f9f4a2-1dd2-40ba-873a-e5f2a8b8557d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.167029] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a69382ef-b835-42b0-aca1-1d6347edf3e1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.180347] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8a41591-a2c7-4759-bd06-680d14a6dee1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.227264] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ee51ec0-2bd4-415d-b2c3-822fcf68d93f {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.236396] env[67131]: DEBUG oslo_vmware.api [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Task: {'id': task-3456439, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068678} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 706.237363] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 706.237634] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 706.239666] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 706.239666] env[67131]: INFO nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 706.240163] env[67131]: DEBUG nova.compute.claims [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 706.241073] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.241073] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.243724] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-863dba0c-880c-4c28-8694-c9fa63f596df {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.268716] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 706.321733] env[67131]: DEBUG oslo_vmware.rw_handles [None req-553f945d-8d25-4703-b659-738cd447059a 
tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/75d6314c-88cd-4743-83ee-9f07b66846fd/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 706.385644] env[67131]: DEBUG oslo_vmware.rw_handles [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 706.385843] env[67131]: DEBUG oslo_vmware.rw_handles [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/75d6314c-88cd-4743-83ee-9f07b66846fd/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 706.702637] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d3977f2-4d1f-4cda-b526-ce64555cb3a1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.710437] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e68e4a3a-2851-40c3-bd6e-c11de0e50ccc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.745229] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f054b19-56bd-4325-bf64-075a42575f64 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.753952] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8f2e048-7a4b-4e0e-8e1a-08cdf8b0f917 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.768785] env[67131]: DEBUG nova.compute.provider_tree [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 706.779724] env[67131]: DEBUG nova.scheduler.client.report [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 
1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 706.795338] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.554s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.795338] env[67131]: ERROR nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 706.795338] env[67131]: Faults: ['InvalidArgument'] [ 706.795338] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Traceback (most recent call last): [ 706.795338] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 706.795338] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] self.driver.spawn(context, instance, image_meta, [ 706.795338] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 706.795338] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 706.795338] env[67131]: ERROR nova.compute.manager 
[instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 706.795338] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] self._fetch_image_if_missing(context, vi) [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] image_cache(vi, tmp_image_ds_loc) [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] vm_util.copy_virtual_disk( [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] session._wait_for_task(vmdk_copy_task) [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] return self.wait_for_task(task_ref) [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] return evt.wait() [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] result = hub.switch() [ 706.795771] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 706.796064] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] return self.greenlet.switch() [ 706.796064] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 706.796064] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] self.f(*self.args, **self.kw) [ 706.796064] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 706.796064] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] raise exceptions.translate_fault(task_info.error) [ 706.796064] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 706.796064] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Faults: ['InvalidArgument'] [ 706.796064] env[67131]: ERROR nova.compute.manager [instance: e55f1592-024d-431d-b3a9-63b27513cac4] [ 706.796064] env[67131]: DEBUG nova.compute.utils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] VimFaultException {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 706.798022] 
env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Build of instance e55f1592-024d-431d-b3a9-63b27513cac4 was re-scheduled: A specified parameter was not correct: fileType [ 706.798022] env[67131]: Faults: ['InvalidArgument'] {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 706.798022] env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 706.798022] env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 706.798022] env[67131]: DEBUG nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 706.798226] env[67131]: DEBUG nova.network.neutron [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 707.603980] env[67131]: DEBUG nova.network.neutron [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.615534] env[67131]: INFO nova.compute.manager [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] [instance: e55f1592-024d-431d-b3a9-63b27513cac4] Took 0.82 seconds to deallocate network for instance. 
[ 707.726701] env[67131]: INFO nova.scheduler.client.report [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Deleted allocations for instance e55f1592-024d-431d-b3a9-63b27513cac4 [ 707.749417] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0e0c320c-473e-4c8b-a1b8-1b716fbba07b tempest-ServerDiagnosticsTest-1691387679 tempest-ServerDiagnosticsTest-1691387679-project-member] Lock "e55f1592-024d-431d-b3a9-63b27513cac4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 104.635s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.782760] env[67131]: DEBUG nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 707.864307] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.864307] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.865765] env[67131]: INFO nova.compute.claims [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 708.235249] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c11dd3f-f13c-472e-b009-888bc22f89bd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.243396] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02486094-18fa-4596-aaaa-00a7ab12a82a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.280058] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-266ef5ff-d3fe-4dbf-bfbf-4cd03016a5f3 {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.287223] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff340690-da62-460a-a1d6-d68f79f0e6df {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.303096] env[67131]: DEBUG nova.compute.provider_tree [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 708.317406] env[67131]: DEBUG nova.scheduler.client.report [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 708.339946] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.478s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.340556] env[67131]: DEBUG nova.compute.manager [None 
req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 708.376797] env[67131]: DEBUG nova.compute.utils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 708.380092] env[67131]: DEBUG nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 708.380092] env[67131]: DEBUG nova.network.neutron [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 708.391538] env[67131]: DEBUG nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Start building block device mappings for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 708.476251] env[67131]: DEBUG nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Start spawning the instance on the hypervisor. {{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 708.502840] env[67131]: DEBUG nova.policy [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7dbd792c918e484ba2fbf7da6fb871c8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ce0eff99ed747fcbb6b62d0e8ff1f0e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 708.516610] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 708.516848] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 708.517013] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 708.517762] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 708.517935] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 708.518149] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 
tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 708.518315] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 708.518476] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 708.518651] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 708.519839] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 708.519839] env[67131]: DEBUG nova.virt.hardware [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 708.519943] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd422fb6-0967-4340-9cd3-0e1274791a4d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.534542] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b364803c-3dbb-4832-87e4-5f93247e4b6b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.110104] env[67131]: DEBUG nova.network.neutron [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Successfully created port: b040a812-4e4c-48e9-93ab-e53374566852 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 710.391713] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Acquiring lock "c5368926-ed52-414f-9342-27c71e4e3557" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.392030] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Lock "c5368926-ed52-414f-9342-27c71e4e3557" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.466765] env[67131]: DEBUG 
nova.network.neutron [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Successfully updated port: b040a812-4e4c-48e9-93ab-e53374566852 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 710.482576] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Acquiring lock "refresh_cache-9ebfb760-6b10-4ea7-8276-f5e8b1ec6208" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 710.482729] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Acquired lock "refresh_cache-9ebfb760-6b10-4ea7-8276-f5e8b1ec6208" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 710.482879] env[67131]: DEBUG nova.network.neutron [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 710.555923] env[67131]: DEBUG nova.network.neutron [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 710.871976] env[67131]: DEBUG nova.network.neutron [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Updating instance_info_cache with network_info: [{"id": "b040a812-4e4c-48e9-93ab-e53374566852", "address": "fa:16:3e:6e:3b:6b", "network": {"id": "e9354b9b-8621-4af2-adab-3dc8b4968f8d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1527964771-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6ce0eff99ed747fcbb6b62d0e8ff1f0e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d182e8eb-3f6d-4c76-a06e-133dd9b3cd30", "external-id": "nsx-vlan-transportzone-260", "segmentation_id": 260, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb040a812-4e", "ovs_interfaceid": "b040a812-4e4c-48e9-93ab-e53374566852", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.887581] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Releasing lock "refresh_cache-9ebfb760-6b10-4ea7-8276-f5e8b1ec6208" {{(pid=67131) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 710.887911] env[67131]: DEBUG nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Instance network_info: |[{"id": "b040a812-4e4c-48e9-93ab-e53374566852", "address": "fa:16:3e:6e:3b:6b", "network": {"id": "e9354b9b-8621-4af2-adab-3dc8b4968f8d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1527964771-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6ce0eff99ed747fcbb6b62d0e8ff1f0e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d182e8eb-3f6d-4c76-a06e-133dd9b3cd30", "external-id": "nsx-vlan-transportzone-260", "segmentation_id": 260, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb040a812-4e", "ovs_interfaceid": "b040a812-4e4c-48e9-93ab-e53374566852", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 710.888774] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6e:3b:6b', 'network_ref': 
{'type': 'OpaqueNetwork', 'network-id': 'd182e8eb-3f6d-4c76-a06e-133dd9b3cd30', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b040a812-4e4c-48e9-93ab-e53374566852', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 710.898738] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Creating folder: Project (6ce0eff99ed747fcbb6b62d0e8ff1f0e). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 710.899581] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2997c76f-2650-4904-96c5-19fbf13c3fd0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.913078] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Created folder: Project (6ce0eff99ed747fcbb6b62d0e8ff1f0e) in parent group-v690228. [ 710.913335] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Creating folder: Instances. Parent ref: group-v690270. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 710.913598] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-77872e4e-b21e-4882-aedd-6de84dfd8b76 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.918615] env[67131]: DEBUG nova.compute.manager [req-b14d11a1-f27a-49e2-97bf-95871c59d919 req-0c5ada3d-a851-43e4-9d3e-4bb3d68e9061 service nova] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Received event network-vif-plugged-b040a812-4e4c-48e9-93ab-e53374566852 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 710.918840] env[67131]: DEBUG oslo_concurrency.lockutils [req-b14d11a1-f27a-49e2-97bf-95871c59d919 req-0c5ada3d-a851-43e4-9d3e-4bb3d68e9061 service nova] Acquiring lock "9ebfb760-6b10-4ea7-8276-f5e8b1ec6208-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.919359] env[67131]: DEBUG oslo_concurrency.lockutils [req-b14d11a1-f27a-49e2-97bf-95871c59d919 req-0c5ada3d-a851-43e4-9d3e-4bb3d68e9061 service nova] Lock "9ebfb760-6b10-4ea7-8276-f5e8b1ec6208-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.919532] env[67131]: DEBUG oslo_concurrency.lockutils [req-b14d11a1-f27a-49e2-97bf-95871c59d919 req-0c5ada3d-a851-43e4-9d3e-4bb3d68e9061 service nova] Lock "9ebfb760-6b10-4ea7-8276-f5e8b1ec6208-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 710.919698] env[67131]: DEBUG nova.compute.manager [req-b14d11a1-f27a-49e2-97bf-95871c59d919 
req-0c5ada3d-a851-43e4-9d3e-4bb3d68e9061 service nova] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] No waiting events found dispatching network-vif-plugged-b040a812-4e4c-48e9-93ab-e53374566852 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 710.919861] env[67131]: WARNING nova.compute.manager [req-b14d11a1-f27a-49e2-97bf-95871c59d919 req-0c5ada3d-a851-43e4-9d3e-4bb3d68e9061 service nova] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Received unexpected event network-vif-plugged-b040a812-4e4c-48e9-93ab-e53374566852 for instance with vm_state building and task_state spawning. [ 710.924191] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Created folder: Instances in parent group-v690270. [ 710.924431] env[67131]: DEBUG oslo.service.loopingcall [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 710.924648] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 710.924800] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fc07e205-c161-4ad6-a0ba-23c7f5cf1922 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.947310] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 710.947310] env[67131]: value = "task-3456445" [ 710.947310] env[67131]: _type = "Task" [ 710.947310] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 710.956272] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456445, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 711.467222] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456445, 'name': CreateVM_Task, 'duration_secs': 0.387734} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 711.467222] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 711.467222] env[67131]: DEBUG oslo_vmware.service [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b543508-c9af-43ca-b6a5-5908ec07169a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.472890] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.473108] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 
711.473557] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 711.473815] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-abb37880-a5f5-47b3-810d-88621553ba9f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.478916] env[67131]: DEBUG oslo_vmware.api [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Waiting for the task: (returnval){ [ 711.478916] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52a3cdee-d728-23cc-413d-95ff25585e58" [ 711.478916] env[67131]: _type = "Task" [ 711.478916] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 711.487313] env[67131]: DEBUG oslo_vmware.api [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52a3cdee-d728-23cc-413d-95ff25585e58, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 711.991155] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 711.991407] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 711.991638] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.991786] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 711.991965] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 
tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 711.992218] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-719be329-625d-412c-89a6-16a711f402e6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.013738] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 712.013934] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 712.014742] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d93984d-da97-4681-a317-4c50008aaa2f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.022059] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6bf6a961-958f-403c-807f-bb01334a5e0f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.027931] env[67131]: DEBUG oslo_vmware.api [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Waiting for the task: (returnval){ [ 712.027931] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]524d5346-8d5b-e860-cb50-99d7bf307b89" [ 712.027931] env[67131]: _type = "Task" [ 712.027931] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 712.036066] env[67131]: DEBUG oslo_vmware.api [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]524d5346-8d5b-e860-cb50-99d7bf307b89, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 712.538924] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 712.539288] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Creating directory with path [datastore2] vmware_temp/637d9a04-5ccf-4c2c-8dba-62feebaec5f3/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 712.539444] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-74fbfaaf-7030-4c9d-a556-c0f3e3c94926 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.577516] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Created directory with path [datastore2] vmware_temp/637d9a04-5ccf-4c2c-8dba-62feebaec5f3/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 712.577709] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Fetch image to [datastore2] vmware_temp/637d9a04-5ccf-4c2c-8dba-62feebaec5f3/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 712.577868] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore2] vmware_temp/637d9a04-5ccf-4c2c-8dba-62feebaec5f3/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore2 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 712.578787] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8844f793-fcdf-4ce0-9227-bafbad240a13 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.590690] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2adeaeb-cbc9-42f8-a0de-48e21633c8cc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.602502] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-620ba636-099e-4286-bb27-853acc9753c7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.635457] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c06620e9-4760-4f56-8094-1cb0c9137f19 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.642959] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e14b0394-0cf6-45b5-8852-9b2a82fde3a0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.666326] env[67131]: DEBUG nova.virt.vmwareapi.images [None 
req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore2 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 712.721407] env[67131]: DEBUG oslo_vmware.rw_handles [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/637d9a04-5ccf-4c2c-8dba-62feebaec5f3/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 712.783215] env[67131]: DEBUG oslo_vmware.rw_handles [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 712.783428] env[67131]: DEBUG oslo_vmware.rw_handles [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/637d9a04-5ccf-4c2c-8dba-62feebaec5f3/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 713.175262] env[67131]: DEBUG nova.compute.manager [req-14aa3337-21f5-4fcb-b76d-10873741a977 req-2166d568-bbc2-402d-b589-321225a434cd service nova] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Received event network-changed-b040a812-4e4c-48e9-93ab-e53374566852 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 713.175262] env[67131]: DEBUG nova.compute.manager [req-14aa3337-21f5-4fcb-b76d-10873741a977 req-2166d568-bbc2-402d-b589-321225a434cd service nova] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Refreshing instance network info cache due to event network-changed-b040a812-4e4c-48e9-93ab-e53374566852. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 713.175262] env[67131]: DEBUG oslo_concurrency.lockutils [req-14aa3337-21f5-4fcb-b76d-10873741a977 req-2166d568-bbc2-402d-b589-321225a434cd service nova] Acquiring lock "refresh_cache-9ebfb760-6b10-4ea7-8276-f5e8b1ec6208" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 713.175262] env[67131]: DEBUG oslo_concurrency.lockutils [req-14aa3337-21f5-4fcb-b76d-10873741a977 req-2166d568-bbc2-402d-b589-321225a434cd service nova] Acquired lock "refresh_cache-9ebfb760-6b10-4ea7-8276-f5e8b1ec6208" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 713.175262] env[67131]: DEBUG nova.network.neutron [req-14aa3337-21f5-4fcb-b76d-10873741a977 req-2166d568-bbc2-402d-b589-321225a434cd service nova] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Refreshing network info cache for port b040a812-4e4c-48e9-93ab-e53374566852 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 713.683667] env[67131]: DEBUG nova.network.neutron [req-14aa3337-21f5-4fcb-b76d-10873741a977 req-2166d568-bbc2-402d-b589-321225a434cd service nova] [instance: 
9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Updated VIF entry in instance network info cache for port b040a812-4e4c-48e9-93ab-e53374566852. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 713.684165] env[67131]: DEBUG nova.network.neutron [req-14aa3337-21f5-4fcb-b76d-10873741a977 req-2166d568-bbc2-402d-b589-321225a434cd service nova] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Updating instance_info_cache with network_info: [{"id": "b040a812-4e4c-48e9-93ab-e53374566852", "address": "fa:16:3e:6e:3b:6b", "network": {"id": "e9354b9b-8621-4af2-adab-3dc8b4968f8d", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1527964771-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6ce0eff99ed747fcbb6b62d0e8ff1f0e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d182e8eb-3f6d-4c76-a06e-133dd9b3cd30", "external-id": "nsx-vlan-transportzone-260", "segmentation_id": 260, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb040a812-4e", "ovs_interfaceid": "b040a812-4e4c-48e9-93ab-e53374566852", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.697144] env[67131]: DEBUG oslo_concurrency.lockutils [req-14aa3337-21f5-4fcb-b76d-10873741a977 req-2166d568-bbc2-402d-b589-321225a434cd service nova] Releasing lock "refresh_cache-9ebfb760-6b10-4ea7-8276-f5e8b1ec6208" {{(pid=67131) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 715.564781] env[67131]: DEBUG oslo_concurrency.lockutils [None req-5f440b03-70d1-47a0-ab9f-8d31fbe9aaf0 tempest-InstanceActionsNegativeTestJSON-524430231 tempest-InstanceActionsNegativeTestJSON-524430231-project-member] Acquiring lock "b1b04cd3-c691-4689-a7c4-d97798668092" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.564781] env[67131]: DEBUG oslo_concurrency.lockutils [None req-5f440b03-70d1-47a0-ab9f-8d31fbe9aaf0 tempest-InstanceActionsNegativeTestJSON-524430231 tempest-InstanceActionsNegativeTestJSON-524430231-project-member] Lock "b1b04cd3-c691-4689-a7c4-d97798668092" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.438016] env[67131]: DEBUG oslo_concurrency.lockutils [None req-1ff5c02f-bcca-4e5a-a77f-044999f529c7 tempest-ServerActionsV293TestJSON-563675839 tempest-ServerActionsV293TestJSON-563675839-project-member] Acquiring lock "12dd51ad-bb48-4166-a208-0c8f6dd044fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.438377] env[67131]: DEBUG oslo_concurrency.lockutils [None req-1ff5c02f-bcca-4e5a-a77f-044999f529c7 tempest-ServerActionsV293TestJSON-563675839 tempest-ServerActionsV293TestJSON-563675839-project-member] Lock "12dd51ad-bb48-4166-a208-0c8f6dd044fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 738.211817] env[67131]: DEBUG 
oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 739.215930] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 739.216220] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Starting heal instance info cache {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 739.216253] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Rebuilding the list of instances to heal {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 739.236736] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 739.236901] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 739.237045] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Skipping network cache update for instance because it is Building. 
{{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 739.237180] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 739.237305] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 739.237508] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 739.237637] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 739.237760] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 739.237879] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Skipping network cache update for instance because it is Building. 
{{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 739.237997] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 739.238130] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Didn't find any instances for network info cache update. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 739.238597] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 739.238779] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 739.238918] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67131) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 740.215993] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 740.215993] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager.update_available_resource {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 740.226803] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.227047] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.227218] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.227370] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67131) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 
740.228542] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffcd39b0-616e-4f7c-be76-2409ead9f912 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.237450] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de471b95-70a6-4283-ad6a-b015418af183 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.251011] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11328c67-79d8-4eb6-adc9-d8a7b236b4c1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.257403] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0427b5df-2792-4eae-9c1c-b69b84531a2b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.286373] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180883MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67131) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 740.286558] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.286752] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s 
{{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.354143] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance c9a491fe-aff4-4b4f-bcfb-dd56f1010576 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.354279] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 47856710-dd0d-4d4a-9af4-ae3db29510e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.354407] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.354531] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance cf2feb3e-6cf3-4db2-83fd-beb5e8387e81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.354684] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.354762] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance b47e3b03-7b84-4305-a55c-577401e5acf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.354879] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 39b67ef8-fce0-4bf3-b161-b5fbd588214b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.354997] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 28bf23c6-d36a-4822-9569-c825a7366ed4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.355133] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 7e46e878-1564-4f3b-baa5-5c99d7e04d80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.355249] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.378598] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.389397] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance c8c85f1c-6876-4632-a2d6-a835912d3285 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.400597] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 4ce5668d-b588-4b92-bcc9-11d03eff2a84 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.413380] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 3b2e1650-ee7f-46a2-94db-1a611384be03 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.423546] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance ad0b7a66-fcec-4aa9-8a86-aaa24ad9698c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.435724] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 8275c0cc-71d8-4e9c-a324-3955fe1a9943 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.445764] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 691eb0c7-b6f0-45ff-92fb-1e47d38587f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.456024] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 14293002-9e0b-4e4c-b4c5-9c726995dde0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.468466] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 61b77ab6-94d4-4a69-a2f5-b472215c46e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.480154] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance fd9b8dbe-0dc9-4072-93b3-6a06fdd1cd25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.490828] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance d3ea7ca6-6832-4e5c-ae89-40eb286c8bd9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.505241] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance c5368926-ed52-414f-9342-27c71e4e3557 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.514450] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance b1b04cd3-c691-4689-a7c4-d97798668092 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.522201] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 12dd51ad-bb48-4166-a208-0c8f6dd044fe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 740.522427] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 740.522570] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 740.793695] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a09e497-cba8-4734-b7ce-4c910ba5a94c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.801096] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16982162-989c-4439-9e8d-6057d302de2e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.830517] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5762bf2-94f7-48d2-849f-d16c6a5273ca {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.837915] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c38ee60-2c87-4b2b-91fc-46157ea03387 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.851293] env[67131]: DEBUG nova.compute.provider_tree [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 740.859413] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 740.873911] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67131) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 740.874138] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.587s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.874326] env[67131]: DEBUG oslo_service.periodic_task [None 
req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 742.215554] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 742.215785] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.859056] env[67131]: WARNING oslo_vmware.rw_handles [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 752.859056] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 752.859056] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 752.859056] env[67131]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 752.859056] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 752.859056] env[67131]: ERROR oslo_vmware.rw_handles response.begin() [ 752.859056] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 752.859056] env[67131]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 752.859056] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 752.859056] 
env[67131]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 752.859056] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 752.859056] env[67131]: ERROR oslo_vmware.rw_handles [ 752.859056] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/75d6314c-88cd-4743-83ee-9f07b66846fd/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 752.860659] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 752.860906] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Copying Virtual Disk [datastore1] vmware_temp/75d6314c-88cd-4743-83ee-9f07b66846fd/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] vmware_temp/75d6314c-88cd-4743-83ee-9f07b66846fd/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 752.861277] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bd53da9c-16a0-4ca6-8ddf-50f586e6fdfd {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.869225] env[67131]: DEBUG oslo_vmware.api [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Waiting for the task: (returnval){ [ 752.869225] env[67131]: value = "task-3456449" [ 752.869225] env[67131]: _type = "Task" [ 752.869225] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 752.877155] env[67131]: DEBUG oslo_vmware.api [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Task: {'id': task-3456449, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 753.379803] env[67131]: DEBUG oslo_vmware.exceptions [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Fault InvalidArgument not matched. 
{{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 753.380051] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 753.380593] env[67131]: ERROR nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 753.380593] env[67131]: Faults: ['InvalidArgument'] [ 753.380593] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Traceback (most recent call last): [ 753.380593] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 753.380593] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] yield resources [ 753.380593] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 753.380593] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] self.driver.spawn(context, instance, image_meta, [ 753.380593] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 753.380593] env[67131]: ERROR nova.compute.manager [instance: 
cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] self._vmops.spawn(context, instance, image_meta, injected_files, [ 753.380593] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 753.380593] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] self._fetch_image_if_missing(context, vi) [ 753.380593] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] image_cache(vi, tmp_image_ds_loc) [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] vm_util.copy_virtual_disk( [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] session._wait_for_task(vmdk_copy_task) [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] return self.wait_for_task(task_ref) [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: 
cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] return evt.wait() [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] result = hub.switch() [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 753.380907] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] return self.greenlet.switch() [ 753.381335] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 753.381335] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] self.f(*self.args, **self.kw) [ 753.381335] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 753.381335] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] raise exceptions.translate_fault(task_info.error) [ 753.381335] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 753.381335] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Faults: ['InvalidArgument'] [ 753.381335] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] [ 753.381335] env[67131]: INFO nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] 
[instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Terminating instance [ 753.382500] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 753.382700] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 753.382927] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4712549e-f72d-4c0b-b296-5f2e15a21e0c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.385169] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquiring lock "refresh_cache-cf2feb3e-6cf3-4db2-83fd-beb5e8387e81" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 753.385322] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquired lock "refresh_cache-cf2feb3e-6cf3-4db2-83fd-beb5e8387e81" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 753.385485] env[67131]: DEBUG nova.network.neutron [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 
tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 753.393220] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 753.393220] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 753.393453] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7a771c34-550e-4022-b222-b2b0d083477c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.400287] env[67131]: DEBUG oslo_vmware.api [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Waiting for the task: (returnval){ [ 753.400287] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52ca2409-61ec-6d31-0b78-ec4cd868d2c9" [ 753.400287] env[67131]: _type = "Task" [ 753.400287] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 753.407836] env[67131]: DEBUG oslo_vmware.api [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52ca2409-61ec-6d31-0b78-ec4cd868d2c9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 753.424711] env[67131]: DEBUG nova.network.neutron [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.543373] env[67131]: DEBUG nova.network.neutron [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.552558] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Releasing lock "refresh_cache-cf2feb3e-6cf3-4db2-83fd-beb5e8387e81" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 753.552931] env[67131]: DEBUG nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 753.553131] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 753.554174] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bb42cda-d56f-45fb-a1f0-ab517f17f6f7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.561891] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 753.562128] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-90377a42-85a2-421c-be26-a57be2f13504 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.588716] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 753.588927] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Deleting contents of the VM from datastore datastore1 
{{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 753.589222] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Deleting the datastore file [datastore1] cf2feb3e-6cf3-4db2-83fd-beb5e8387e81 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 753.589457] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-de8da522-425b-4a4f-b8bd-db26b69729b6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.595603] env[67131]: DEBUG oslo_vmware.api [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Waiting for the task: (returnval){ [ 753.595603] env[67131]: value = "task-3456451" [ 753.595603] env[67131]: _type = "Task" [ 753.595603] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 753.602946] env[67131]: DEBUG oslo_vmware.api [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Task: {'id': task-3456451, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 753.910088] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 753.910378] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Creating directory with path [datastore1] vmware_temp/33019715-7e17-4b0c-888b-45e0f1335a7d/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 753.910558] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8633589f-206d-430e-b93b-ead566e52ce2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.921690] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Created directory with path [datastore1] vmware_temp/33019715-7e17-4b0c-888b-45e0f1335a7d/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 753.921877] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Fetch image to [datastore1] vmware_temp/33019715-7e17-4b0c-888b-45e0f1335a7d/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 753.922085] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/33019715-7e17-4b0c-888b-45e0f1335a7d/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 753.922755] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-680c4795-104a-437b-b64b-7bfcceecef5f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.928977] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-007b7122-12cf-4d85-9ed3-c248190bfb4f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.937771] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c9e615d-e977-440a-ba72-2e54f4baa16e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.968460] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-483871e0-8b02-42ac-9db0-04f31307deca {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.973629] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-120e6a71-7279-411c-9e63-b2711a6f8441 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.005296] env[67131]: DEBUG nova.virt.vmwareapi.images [None 
req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 754.048887] env[67131]: DEBUG oslo_vmware.rw_handles [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/33019715-7e17-4b0c-888b-45e0f1335a7d/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 754.106828] env[67131]: DEBUG oslo_vmware.rw_handles [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 754.106987] env[67131]: DEBUG oslo_vmware.rw_handles [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/33019715-7e17-4b0c-888b-45e0f1335a7d/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 754.110627] env[67131]: DEBUG oslo_vmware.api [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Task: {'id': task-3456451, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.044223} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 754.112128] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 754.112128] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 754.112128] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 754.112128] env[67131]: INFO nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Took 0.56 seconds to destroy the instance on the hypervisor. 
[ 754.112128] env[67131]: DEBUG oslo.service.loopingcall [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 754.112322] env[67131]: DEBUG nova.compute.manager [-] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Skipping network deallocation for instance since networking was not requested. {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 754.113842] env[67131]: DEBUG nova.compute.claims [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 754.114015] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.114231] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.408067] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1f944e58-6b62-439e-9a46-6c7f5f82fbda {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.416216] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9f9f790-0241-49d9-a9c9-86c9313c5b08 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.446870] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ebd0053-d356-4210-9bf3-630753ebe2f5 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.454096] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62cdaa9a-33d5-4639-bfb6-8f145c99b61a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.467184] env[67131]: DEBUG nova.compute.provider_tree [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 754.475454] env[67131]: DEBUG nova.scheduler.client.report [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 754.490797] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.376s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.491393] env[67131]: ERROR nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 754.491393] env[67131]: Faults: ['InvalidArgument'] [ 754.491393] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Traceback (most recent call last): [ 754.491393] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 754.491393] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] self.driver.spawn(context, instance, image_meta, [ 754.491393] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 754.491393] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] self._vmops.spawn(context, instance, image_meta, injected_files, [ 754.491393] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 754.491393] env[67131]: ERROR 
nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] self._fetch_image_if_missing(context, vi) [ 754.491393] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 754.491393] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] image_cache(vi, tmp_image_ds_loc) [ 754.491393] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] vm_util.copy_virtual_disk( [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] session._wait_for_task(vmdk_copy_task) [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] return self.wait_for_task(task_ref) [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] return evt.wait() [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: 
cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] result = hub.switch() [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] return self.greenlet.switch() [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 754.491862] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] self.f(*self.args, **self.kw) [ 754.492301] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 754.492301] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] raise exceptions.translate_fault(task_info.error) [ 754.492301] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 754.492301] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Faults: ['InvalidArgument'] [ 754.492301] env[67131]: ERROR nova.compute.manager [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] [ 754.492301] env[67131]: DEBUG nova.compute.utils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] VimFaultException {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 754.493452] env[67131]: DEBUG nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 
tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Build of instance cf2feb3e-6cf3-4db2-83fd-beb5e8387e81 was re-scheduled: A specified parameter was not correct: fileType [ 754.493452] env[67131]: Faults: ['InvalidArgument'] {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 754.493850] env[67131]: DEBUG nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 754.494041] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquiring lock "refresh_cache-cf2feb3e-6cf3-4db2-83fd-beb5e8387e81" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 754.494187] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Acquired lock "refresh_cache-cf2feb3e-6cf3-4db2-83fd-beb5e8387e81" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 754.494345] env[67131]: DEBUG nova.network.neutron [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 754.527465] env[67131]: DEBUG nova.network.neutron [None req-553f945d-8d25-4703-b659-738cd447059a 
tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.621992] env[67131]: DEBUG nova.network.neutron [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 754.630726] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Releasing lock "refresh_cache-cf2feb3e-6cf3-4db2-83fd-beb5e8387e81" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 754.630936] env[67131]: DEBUG nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 754.631182] env[67131]: DEBUG nova.compute.manager [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] [instance: cf2feb3e-6cf3-4db2-83fd-beb5e8387e81] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 754.711596] env[67131]: INFO nova.scheduler.client.report [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Deleted allocations for instance cf2feb3e-6cf3-4db2-83fd-beb5e8387e81 [ 754.731941] env[67131]: DEBUG oslo_concurrency.lockutils [None req-553f945d-8d25-4703-b659-738cd447059a tempest-ServerDiagnosticsV248Test-1690652633 tempest-ServerDiagnosticsV248Test-1690652633-project-member] Lock "cf2feb3e-6cf3-4db2-83fd-beb5e8387e81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 146.879s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.760810] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 754.808891] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.809167] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.810648] env[67131]: INFO nova.compute.claims [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 755.102449] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d959ae41-f583-47ca-b6c9-11dc812bb9ff {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.109690] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaf7de7b-f3a9-45b5-b188-5105372c0c2c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.139330] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c86e094-80be-4e35-bbe0-bf3f98ca8991 {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.145925] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe801c77-85f4-4c5f-abaf-e61677eb129d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.158337] env[67131]: DEBUG nova.compute.provider_tree [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 755.166931] env[67131]: DEBUG nova.scheduler.client.report [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 755.183050] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.374s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 755.183515] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 
tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 755.212996] env[67131]: DEBUG nova.compute.utils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 755.214568] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 755.214838] env[67131]: DEBUG nova.network.neutron [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 755.223383] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 755.285236] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 755.306230] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 755.306478] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 755.306631] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 755.306811] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Flavor pref 0:0:0 {{(pid=67131) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 755.306954] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 755.307114] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 755.307318] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 755.307474] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 755.307638] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 755.307795] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 755.307964] env[67131]: DEBUG nova.virt.hardware [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 755.308811] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ab8aedf-a5ab-4051-a277-d23c35aa22bc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.316542] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f3e7097-36c9-498a-99a1-a85f9b7c3e5b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.321545] env[67131]: DEBUG nova.policy [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9df0e19972d9477386e4fc56ac48fc7c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd928c127fda4f18b2e0d88e98aa1691', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 755.800766] env[67131]: DEBUG nova.network.neutron [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] 
Successfully created port: 5606f9ef-2dff-4b18-b3fd-b9b295425170 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 756.982293] env[67131]: DEBUG nova.network.neutron [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Successfully updated port: 5606f9ef-2dff-4b18-b3fd-b9b295425170 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 756.991651] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Acquiring lock "refresh_cache-aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 756.991800] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Acquired lock "refresh_cache-aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 756.992204] env[67131]: DEBUG nova.network.neutron [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 757.052334] env[67131]: DEBUG nova.network.neutron [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.213666] env[67131]: DEBUG nova.compute.manager [req-24dd0c5a-a3b4-464f-a5db-e6b227da7ddd req-e24adcac-991e-4246-bf32-b717cb3a9c5c service nova] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Received event network-vif-plugged-5606f9ef-2dff-4b18-b3fd-b9b295425170 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 757.213666] env[67131]: DEBUG oslo_concurrency.lockutils [req-24dd0c5a-a3b4-464f-a5db-e6b227da7ddd req-e24adcac-991e-4246-bf32-b717cb3a9c5c service nova] Acquiring lock "aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 757.213791] env[67131]: DEBUG oslo_concurrency.lockutils [req-24dd0c5a-a3b4-464f-a5db-e6b227da7ddd req-e24adcac-991e-4246-bf32-b717cb3a9c5c service nova] Lock "aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 757.213943] env[67131]: DEBUG oslo_concurrency.lockutils [req-24dd0c5a-a3b4-464f-a5db-e6b227da7ddd req-e24adcac-991e-4246-bf32-b717cb3a9c5c service nova] Lock "aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 757.214152] env[67131]: DEBUG nova.compute.manager [req-24dd0c5a-a3b4-464f-a5db-e6b227da7ddd req-e24adcac-991e-4246-bf32-b717cb3a9c5c service nova] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] No waiting events found dispatching network-vif-plugged-5606f9ef-2dff-4b18-b3fd-b9b295425170 {{(pid=67131) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 757.214618] env[67131]: WARNING nova.compute.manager [req-24dd0c5a-a3b4-464f-a5db-e6b227da7ddd req-e24adcac-991e-4246-bf32-b717cb3a9c5c service nova] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Received unexpected event network-vif-plugged-5606f9ef-2dff-4b18-b3fd-b9b295425170 for instance with vm_state building and task_state spawning. [ 757.347193] env[67131]: DEBUG nova.network.neutron [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Updating instance_info_cache with network_info: [{"id": "5606f9ef-2dff-4b18-b3fd-b9b295425170", "address": "fa:16:3e:3c:d8:87", "network": {"id": "af295a9e-efd6-4477-80aa-4a5a693c5086", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-885137633-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd928c127fda4f18b2e0d88e98aa1691", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b399c74-1411-408a-b4cd-84e268ae83fe", "external-id": "nsx-vlan-transportzone-486", "segmentation_id": 486, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5606f9ef-2d", "ovs_interfaceid": "5606f9ef-2dff-4b18-b3fd-b9b295425170", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.357137] env[67131]: DEBUG 
oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Releasing lock "refresh_cache-aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 757.357563] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Instance network_info: |[{"id": "5606f9ef-2dff-4b18-b3fd-b9b295425170", "address": "fa:16:3e:3c:d8:87", "network": {"id": "af295a9e-efd6-4477-80aa-4a5a693c5086", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-885137633-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd928c127fda4f18b2e0d88e98aa1691", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b399c74-1411-408a-b4cd-84e268ae83fe", "external-id": "nsx-vlan-transportzone-486", "segmentation_id": 486, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5606f9ef-2d", "ovs_interfaceid": "5606f9ef-2dff-4b18-b3fd-b9b295425170", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 757.358294] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 
tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3c:d8:87', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6b399c74-1411-408a-b4cd-84e268ae83fe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5606f9ef-2dff-4b18-b3fd-b9b295425170', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 757.367383] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Creating folder: Project (bd928c127fda4f18b2e0d88e98aa1691). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 757.368229] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b51816a4-f50d-4dc1-87ff-0be2f1980055 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.379275] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Created folder: Project (bd928c127fda4f18b2e0d88e98aa1691) in parent group-v690228. [ 757.381020] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Creating folder: Instances. Parent ref: group-v690273. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 757.381020] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1020ba09-3d69-4b64-9d19-e47f34026e7b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.391015] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Created folder: Instances in parent group-v690273. [ 757.391015] env[67131]: DEBUG oslo.service.loopingcall [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 757.391015] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 757.391015] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e7299c12-a6dc-4df9-b058-722521d553cb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.409722] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 757.409722] env[67131]: value = "task-3456454" [ 757.409722] env[67131]: _type = "Task" [ 757.409722] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 757.417455] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456454, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 757.919711] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456454, 'name': CreateVM_Task, 'duration_secs': 0.395026} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 757.919886] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 757.920609] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 757.920840] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 757.921097] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 757.921391] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0dfa1691-6bec-459a-b411-a25308f0bc86 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
757.925576] env[67131]: DEBUG oslo_vmware.api [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Waiting for the task: (returnval){ [ 757.925576] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52e0dcad-3e03-1c5b-c445-6596031acf89" [ 757.925576] env[67131]: _type = "Task" [ 757.925576] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 757.935447] env[67131]: DEBUG oslo_vmware.api [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52e0dcad-3e03-1c5b-c445-6596031acf89, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 758.436396] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 758.436700] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 758.436919] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Acquiring lock "[datastore1] 
devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 759.578439] env[67131]: DEBUG nova.compute.manager [req-9bfd71de-c798-42e5-8021-f129f7cb481d req-c13cc9a9-d3f5-469e-af50-bee376f69a80 service nova] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Received event network-changed-5606f9ef-2dff-4b18-b3fd-b9b295425170 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 759.578439] env[67131]: DEBUG nova.compute.manager [req-9bfd71de-c798-42e5-8021-f129f7cb481d req-c13cc9a9-d3f5-469e-af50-bee376f69a80 service nova] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Refreshing instance network info cache due to event network-changed-5606f9ef-2dff-4b18-b3fd-b9b295425170. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 759.578439] env[67131]: DEBUG oslo_concurrency.lockutils [req-9bfd71de-c798-42e5-8021-f129f7cb481d req-c13cc9a9-d3f5-469e-af50-bee376f69a80 service nova] Acquiring lock "refresh_cache-aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 759.578439] env[67131]: DEBUG oslo_concurrency.lockutils [req-9bfd71de-c798-42e5-8021-f129f7cb481d req-c13cc9a9-d3f5-469e-af50-bee376f69a80 service nova] Acquired lock "refresh_cache-aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 759.578439] env[67131]: DEBUG nova.network.neutron [req-9bfd71de-c798-42e5-8021-f129f7cb481d req-c13cc9a9-d3f5-469e-af50-bee376f69a80 service nova] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Refreshing network info cache for port 5606f9ef-2dff-4b18-b3fd-b9b295425170 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 760.066613] env[67131]: DEBUG nova.network.neutron 
[req-9bfd71de-c798-42e5-8021-f129f7cb481d req-c13cc9a9-d3f5-469e-af50-bee376f69a80 service nova] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Updated VIF entry in instance network info cache for port 5606f9ef-2dff-4b18-b3fd-b9b295425170. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 760.067036] env[67131]: DEBUG nova.network.neutron [req-9bfd71de-c798-42e5-8021-f129f7cb481d req-c13cc9a9-d3f5-469e-af50-bee376f69a80 service nova] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Updating instance_info_cache with network_info: [{"id": "5606f9ef-2dff-4b18-b3fd-b9b295425170", "address": "fa:16:3e:3c:d8:87", "network": {"id": "af295a9e-efd6-4477-80aa-4a5a693c5086", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-885137633-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd928c127fda4f18b2e0d88e98aa1691", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b399c74-1411-408a-b4cd-84e268ae83fe", "external-id": "nsx-vlan-transportzone-486", "segmentation_id": 486, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5606f9ef-2d", "ovs_interfaceid": "5606f9ef-2dff-4b18-b3fd-b9b295425170", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.084290] env[67131]: DEBUG oslo_concurrency.lockutils [req-9bfd71de-c798-42e5-8021-f129f7cb481d req-c13cc9a9-d3f5-469e-af50-bee376f69a80 service 
nova] Releasing lock "refresh_cache-aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 760.099877] env[67131]: WARNING oslo_vmware.rw_handles [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles response.begin() [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 760.099877] env[67131]: ERROR oslo_vmware.rw_handles [ 760.099877] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Downloaded image file data 
6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/637d9a04-5ccf-4c2c-8dba-62feebaec5f3/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore2 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 760.100985] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 760.101280] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Copying Virtual Disk [datastore2] vmware_temp/637d9a04-5ccf-4c2c-8dba-62feebaec5f3/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore2] vmware_temp/637d9a04-5ccf-4c2c-8dba-62feebaec5f3/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 760.102070] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d7fcf91b-e7d4-41dc-a58c-ea6340b1cae7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.110894] env[67131]: DEBUG oslo_vmware.api [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Waiting for the task: (returnval){ [ 760.110894] env[67131]: value = "task-3456455" [ 760.110894] env[67131]: _type = "Task" [ 760.110894] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 760.119754] env[67131]: DEBUG oslo_vmware.api [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Task: {'id': task-3456455, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 760.621294] env[67131]: DEBUG oslo_vmware.exceptions [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Fault InvalidArgument not matched. {{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 760.621616] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 760.622128] env[67131]: ERROR nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 760.622128] env[67131]: Faults: ['InvalidArgument'] [ 760.622128] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Traceback (most recent call last): [ 760.622128] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in 
_build_resources [ 760.622128] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] yield resources [ 760.622128] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 760.622128] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] self.driver.spawn(context, instance, image_meta, [ 760.622128] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 760.622128] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] self._vmops.spawn(context, instance, image_meta, injected_files, [ 760.622128] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 760.622128] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] self._fetch_image_if_missing(context, vi) [ 760.622128] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] image_cache(vi, tmp_image_ds_loc) [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] vm_util.copy_virtual_disk( [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 760.622465] env[67131]: ERROR 
nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] session._wait_for_task(vmdk_copy_task) [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] return self.wait_for_task(task_ref) [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] return evt.wait() [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] result = hub.switch() [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 760.622465] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] return self.greenlet.switch() [ 760.622772] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 760.622772] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] self.f(*self.args, **self.kw) [ 760.622772] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 760.622772] env[67131]: ERROR nova.compute.manager [instance: 
9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] raise exceptions.translate_fault(task_info.error) [ 760.622772] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 760.622772] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Faults: ['InvalidArgument'] [ 760.622772] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] [ 760.622772] env[67131]: INFO nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Terminating instance [ 760.624615] env[67131]: DEBUG nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 760.624810] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 760.625565] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da339702-d132-4f70-ad39-ef96ca687a62 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.632187] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 760.632436] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c9dbdd35-8722-42d0-aaf2-23b6334142e5 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.690018] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 760.690262] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Deleting contents of the VM from 
datastore datastore2 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 760.690435] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Deleting the datastore file [datastore2] 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 760.690688] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-23fc7bac-9851-4f06-b594-b13fcb96a040 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.696805] env[67131]: DEBUG oslo_vmware.api [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Waiting for the task: (returnval){ [ 760.696805] env[67131]: value = "task-3456457" [ 760.696805] env[67131]: _type = "Task" [ 760.696805] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 760.704322] env[67131]: DEBUG oslo_vmware.api [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Task: {'id': task-3456457, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 761.206549] env[67131]: DEBUG oslo_vmware.api [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Task: {'id': task-3456457, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066015} completed successfully. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 761.206797] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 761.206975] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Deleted contents of the VM from datastore datastore2 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 761.207161] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 761.207333] env[67131]: INFO nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Took 0.58 seconds to destroy the instance on the hypervisor. 
[ 761.209531] env[67131]: DEBUG nova.compute.claims [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 761.209702] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 761.209907] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 761.537929] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26b56049-9c3b-48b5-b3ac-734f585d6a0a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.546797] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35575263-332b-48b1-9e25-2f50aea94510 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.576598] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2afad12a-35f4-400c-963f-5265bf88fc2c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.584911] 
env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8bb1c18-a19a-41be-893a-81d1f26445c7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.600082] env[67131]: DEBUG nova.compute.provider_tree [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 761.607278] env[67131]: DEBUG nova.scheduler.client.report [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 761.620798] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.411s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.621363] env[67131]: ERROR nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 
tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 761.621363] env[67131]: Faults: ['InvalidArgument'] [ 761.621363] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Traceback (most recent call last): [ 761.621363] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 761.621363] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] self.driver.spawn(context, instance, image_meta, [ 761.621363] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 761.621363] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] self._vmops.spawn(context, instance, image_meta, injected_files, [ 761.621363] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 761.621363] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] self._fetch_image_if_missing(context, vi) [ 761.621363] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 761.621363] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] image_cache(vi, tmp_image_ds_loc) [ 761.621363] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 761.622075] env[67131]: ERROR nova.compute.manager 
[instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] vm_util.copy_virtual_disk( [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] session._wait_for_task(vmdk_copy_task) [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] return self.wait_for_task(task_ref) [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] return evt.wait() [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] result = hub.switch() [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] return self.greenlet.switch() [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 761.622075] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] 
self.f(*self.args, **self.kw) [ 761.622760] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 761.622760] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] raise exceptions.translate_fault(task_info.error) [ 761.622760] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 761.622760] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Faults: ['InvalidArgument'] [ 761.622760] env[67131]: ERROR nova.compute.manager [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] [ 761.622760] env[67131]: DEBUG nova.compute.utils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] VimFaultException {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 761.623513] env[67131]: DEBUG nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Build of instance 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208 was re-scheduled: A specified parameter was not correct: fileType [ 761.623513] env[67131]: Faults: ['InvalidArgument'] {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 761.623877] env[67131]: DEBUG nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Unplugging VIFs for instance {{(pid=67131) 
_cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 761.624222] env[67131]: DEBUG nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 761.624222] env[67131]: DEBUG nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 761.624368] env[67131]: DEBUG nova.network.neutron [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 762.203142] env[67131]: DEBUG nova.network.neutron [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 762.212528] env[67131]: INFO nova.compute.manager [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] [instance: 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208] Took 0.59 seconds to deallocate network for instance. 
[ 762.316982] env[67131]: INFO nova.scheduler.client.report [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Deleted allocations for instance 9ebfb760-6b10-4ea7-8276-f5e8b1ec6208 [ 762.333413] env[67131]: DEBUG oslo_concurrency.lockutils [None req-567c45a6-df0a-45a6-9c2e-b00e2ac1601e tempest-VolumesAssistedSnapshotsTest-1314990226 tempest-VolumesAssistedSnapshotsTest-1314990226-project-member] Lock "9ebfb760-6b10-4ea7-8276-f5e8b1ec6208" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 135.220s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 762.354324] env[67131]: DEBUG nova.compute.manager [None req-784e84c1-62c4-4393-bec3-b559c7d0a06f tempest-ServersTestJSON-1541738038 tempest-ServersTestJSON-1541738038-project-member] [instance: a4c08ab4-0633-415b-9e2a-6d9a3857c4cd] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 762.383335] env[67131]: DEBUG nova.compute.manager [None req-784e84c1-62c4-4393-bec3-b559c7d0a06f tempest-ServersTestJSON-1541738038 tempest-ServersTestJSON-1541738038-project-member] [instance: a4c08ab4-0633-415b-9e2a-6d9a3857c4cd] Instance disappeared before build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 762.411212] env[67131]: DEBUG oslo_concurrency.lockutils [None req-784e84c1-62c4-4393-bec3-b559c7d0a06f tempest-ServersTestJSON-1541738038 tempest-ServersTestJSON-1541738038-project-member] Lock "a4c08ab4-0633-415b-9e2a-6d9a3857c4cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 68.168s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 762.421727] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 762.482251] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 762.482554] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 762.484055] env[67131]: INFO nova.compute.claims [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Claim successful on node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 762.826717] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-638f7c36-22b0-4d88-bb0d-49d2de2104bf {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.834685] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8221d4d6-4650-4bf6-9964-15471204f299 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.864072] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-205d878a-8c12-41fe-9ebc-cead59703b3a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.871534] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56e3898d-cd09-42b2-bfb1-977f6c93a318 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.884846] env[67131]: DEBUG nova.compute.provider_tree [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 762.893472] env[67131]: DEBUG nova.scheduler.client.report [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 762.907227] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.425s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 762.907735] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 762.938488] env[67131]: DEBUG nova.compute.utils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 762.941408] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Allocating IP information in the background. 
{{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 762.941408] env[67131]: DEBUG nova.network.neutron [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 762.950853] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 763.013732] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 763.035475] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 763.035733] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 763.035889] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 763.036161] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Flavor pref 0:0:0 {{(pid=67131) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 763.036342] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 763.036493] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 763.036701] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 763.036860] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 763.037036] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 763.037204] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 763.037378] env[67131]: DEBUG nova.virt.hardware [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 763.038230] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60a00ea5-8739-45b4-95d6-4894c3376303 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.046226] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a90da32-7ad9-4abf-b1ff-f6dacce4140a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.076093] env[67131]: DEBUG nova.policy [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '323c8bba8b294677a406ecc7abe35353', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd4a80fb8d6b94a8c874d49ef38dc3169', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 763.928055] env[67131]: DEBUG nova.network.neutron [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Successfully 
created port: 97081141-54df-4111-a454-195d22f7e34e {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 764.583769] env[67131]: DEBUG nova.network.neutron [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Successfully created port: 3fa781fa-5a92-4049-b2b9-5e014576e241 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 765.811228] env[67131]: DEBUG nova.network.neutron [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Successfully updated port: 97081141-54df-4111-a454-195d22f7e34e {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 766.487233] env[67131]: DEBUG nova.compute.manager [req-fd27453b-0973-434f-a259-d4eb89be36db req-78aef794-7164-467c-aa90-067973e4b8ca service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Received event network-vif-plugged-97081141-54df-4111-a454-195d22f7e34e {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 766.487558] env[67131]: DEBUG oslo_concurrency.lockutils [req-fd27453b-0973-434f-a259-d4eb89be36db req-78aef794-7164-467c-aa90-067973e4b8ca service nova] Acquiring lock "c8c85f1c-6876-4632-a2d6-a835912d3285-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 766.487642] env[67131]: DEBUG oslo_concurrency.lockutils [req-fd27453b-0973-434f-a259-d4eb89be36db req-78aef794-7164-467c-aa90-067973e4b8ca service nova] Lock "c8c85f1c-6876-4632-a2d6-a835912d3285-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 766.487799] env[67131]: DEBUG oslo_concurrency.lockutils [req-fd27453b-0973-434f-a259-d4eb89be36db req-78aef794-7164-467c-aa90-067973e4b8ca service nova] Lock "c8c85f1c-6876-4632-a2d6-a835912d3285-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 766.487963] env[67131]: DEBUG nova.compute.manager [req-fd27453b-0973-434f-a259-d4eb89be36db req-78aef794-7164-467c-aa90-067973e4b8ca service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] No waiting events found dispatching network-vif-plugged-97081141-54df-4111-a454-195d22f7e34e {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 766.488180] env[67131]: WARNING nova.compute.manager [req-fd27453b-0973-434f-a259-d4eb89be36db req-78aef794-7164-467c-aa90-067973e4b8ca service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Received unexpected event network-vif-plugged-97081141-54df-4111-a454-195d22f7e34e for instance with vm_state building and task_state spawning. 
[ 766.953477] env[67131]: DEBUG nova.network.neutron [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Successfully updated port: 3fa781fa-5a92-4049-b2b9-5e014576e241 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 766.964012] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Acquiring lock "refresh_cache-c8c85f1c-6876-4632-a2d6-a835912d3285" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 766.964012] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Acquired lock "refresh_cache-c8c85f1c-6876-4632-a2d6-a835912d3285" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 766.964012] env[67131]: DEBUG nova.network.neutron [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 767.031292] env[67131]: DEBUG nova.network.neutron [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 767.818962] env[67131]: DEBUG nova.network.neutron [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Updating instance_info_cache with network_info: [{"id": "97081141-54df-4111-a454-195d22f7e34e", "address": "fa:16:3e:25:db:f2", "network": {"id": "92ab72ca-fd47-4f3c-bd45-3e01040ce0f6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1083960", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.202", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d4a80fb8d6b94a8c874d49ef38dc3169", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac563aa7-6d7c-4bd5-9241-7b3e11b8f22d", "external-id": "nsx-vlan-transportzone-730", "segmentation_id": 730, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97081141-54", "ovs_interfaceid": "97081141-54df-4111-a454-195d22f7e34e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3fa781fa-5a92-4049-b2b9-5e014576e241", "address": "fa:16:3e:2a:90:66", "network": {"id": "dea481f6-62c8-49dd-9ea7-b64313876b9b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1989233825", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "d4a80fb8d6b94a8c874d49ef38dc3169", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0721b358-3768-472d-95f8-6d6755ab1635", "external-id": "nsx-vlan-transportzone-314", "segmentation_id": 314, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3fa781fa-5a", "ovs_interfaceid": "3fa781fa-5a92-4049-b2b9-5e014576e241", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 767.839629] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Releasing lock "refresh_cache-c8c85f1c-6876-4632-a2d6-a835912d3285" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 767.840044] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Instance network_info: |[{"id": "97081141-54df-4111-a454-195d22f7e34e", "address": "fa:16:3e:25:db:f2", "network": {"id": "92ab72ca-fd47-4f3c-bd45-3e01040ce0f6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1083960", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.202", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": 
{"injected": false, "tenant_id": "d4a80fb8d6b94a8c874d49ef38dc3169", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac563aa7-6d7c-4bd5-9241-7b3e11b8f22d", "external-id": "nsx-vlan-transportzone-730", "segmentation_id": 730, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97081141-54", "ovs_interfaceid": "97081141-54df-4111-a454-195d22f7e34e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3fa781fa-5a92-4049-b2b9-5e014576e241", "address": "fa:16:3e:2a:90:66", "network": {"id": "dea481f6-62c8-49dd-9ea7-b64313876b9b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1989233825", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "d4a80fb8d6b94a8c874d49ef38dc3169", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0721b358-3768-472d-95f8-6d6755ab1635", "external-id": "nsx-vlan-transportzone-314", "segmentation_id": 314, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3fa781fa-5a", "ovs_interfaceid": "3fa781fa-5a92-4049-b2b9-5e014576e241", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 767.840455] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c 
tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:25:db:f2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ac563aa7-6d7c-4bd5-9241-7b3e11b8f22d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '97081141-54df-4111-a454-195d22f7e34e', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:2a:90:66', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0721b358-3768-472d-95f8-6d6755ab1635', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3fa781fa-5a92-4049-b2b9-5e014576e241', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 767.850054] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Creating folder: Project (d4a80fb8d6b94a8c874d49ef38dc3169). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 767.850609] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3dfe05be-165b-43a8-96ab-7610e1ac600a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.861535] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Created folder: Project (d4a80fb8d6b94a8c874d49ef38dc3169) in parent group-v690228. [ 767.861715] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Creating folder: Instances. Parent ref: group-v690276. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 767.861930] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9eb31d0e-288f-45af-b6fa-73ef334fc702 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.870712] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Created folder: Instances in parent group-v690276. [ 767.870935] env[67131]: DEBUG oslo.service.loopingcall [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 767.871130] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 767.871351] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7488d8e5-e695-4d34-8f73-2150f73c66d7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.892077] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 767.892077] env[67131]: value = "task-3456460" [ 767.892077] env[67131]: _type = "Task" [ 767.892077] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 767.899445] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456460, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 768.403613] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456460, 'name': CreateVM_Task, 'duration_secs': 0.33267} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 768.403895] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 768.404531] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 768.404984] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 768.404984] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 768.405352] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-11aef5ab-c836-434b-95c3-11ca9ed10c2a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 768.410167] 
env[67131]: DEBUG oslo_vmware.api [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Waiting for the task: (returnval){ [ 768.410167] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52b25e20-2b71-c9d7-9b54-17d5fc75d87d" [ 768.410167] env[67131]: _type = "Task" [ 768.410167] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 768.417113] env[67131]: DEBUG oslo_vmware.api [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52b25e20-2b71-c9d7-9b54-17d5fc75d87d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 768.730976] env[67131]: DEBUG nova.compute.manager [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Received event network-changed-97081141-54df-4111-a454-195d22f7e34e {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 768.731182] env[67131]: DEBUG nova.compute.manager [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Refreshing instance network info cache due to event network-changed-97081141-54df-4111-a454-195d22f7e34e. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 768.731471] env[67131]: DEBUG oslo_concurrency.lockutils [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] Acquiring lock "refresh_cache-c8c85f1c-6876-4632-a2d6-a835912d3285" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 768.731615] env[67131]: DEBUG oslo_concurrency.lockutils [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] Acquired lock "refresh_cache-c8c85f1c-6876-4632-a2d6-a835912d3285" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 768.731780] env[67131]: DEBUG nova.network.neutron [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Refreshing network info cache for port 97081141-54df-4111-a454-195d22f7e34e {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 768.919443] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 768.919699] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 768.919907] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 
tempest-ServersTestMultiNic-588121047-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 769.471951] env[67131]: DEBUG nova.network.neutron [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Updated VIF entry in instance network info cache for port 97081141-54df-4111-a454-195d22f7e34e. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 769.472384] env[67131]: DEBUG nova.network.neutron [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Updating instance_info_cache with network_info: [{"id": "97081141-54df-4111-a454-195d22f7e34e", "address": "fa:16:3e:25:db:f2", "network": {"id": "92ab72ca-fd47-4f3c-bd45-3e01040ce0f6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1083960", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.202", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d4a80fb8d6b94a8c874d49ef38dc3169", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac563aa7-6d7c-4bd5-9241-7b3e11b8f22d", "external-id": "nsx-vlan-transportzone-730", "segmentation_id": 730, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97081141-54", "ovs_interfaceid": "97081141-54df-4111-a454-195d22f7e34e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3fa781fa-5a92-4049-b2b9-5e014576e241", "address": "fa:16:3e:2a:90:66", "network": {"id": "dea481f6-62c8-49dd-9ea7-b64313876b9b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1989233825", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "d4a80fb8d6b94a8c874d49ef38dc3169", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0721b358-3768-472d-95f8-6d6755ab1635", "external-id": "nsx-vlan-transportzone-314", "segmentation_id": 314, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3fa781fa-5a", "ovs_interfaceid": "3fa781fa-5a92-4049-b2b9-5e014576e241", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 769.486087] env[67131]: DEBUG oslo_concurrency.lockutils [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] Releasing lock "refresh_cache-c8c85f1c-6876-4632-a2d6-a835912d3285" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 769.486347] env[67131]: DEBUG nova.compute.manager [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Received event network-vif-plugged-3fa781fa-5a92-4049-b2b9-5e014576e241 {{(pid=67131) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11004}} [ 769.486535] env[67131]: DEBUG oslo_concurrency.lockutils [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] Acquiring lock "c8c85f1c-6876-4632-a2d6-a835912d3285-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 769.486731] env[67131]: DEBUG oslo_concurrency.lockutils [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] Lock "c8c85f1c-6876-4632-a2d6-a835912d3285-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 769.486888] env[67131]: DEBUG oslo_concurrency.lockutils [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] Lock "c8c85f1c-6876-4632-a2d6-a835912d3285-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 769.487061] env[67131]: DEBUG nova.compute.manager [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] No waiting events found dispatching network-vif-plugged-3fa781fa-5a92-4049-b2b9-5e014576e241 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 769.487231] env[67131]: WARNING nova.compute.manager [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Received unexpected event network-vif-plugged-3fa781fa-5a92-4049-b2b9-5e014576e241 for instance with vm_state building and task_state spawning. 
[ 769.487392] env[67131]: DEBUG nova.compute.manager [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Received event network-changed-3fa781fa-5a92-4049-b2b9-5e014576e241 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 769.487545] env[67131]: DEBUG nova.compute.manager [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Refreshing instance network info cache due to event network-changed-3fa781fa-5a92-4049-b2b9-5e014576e241. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 769.487720] env[67131]: DEBUG oslo_concurrency.lockutils [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] Acquiring lock "refresh_cache-c8c85f1c-6876-4632-a2d6-a835912d3285" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 769.487853] env[67131]: DEBUG oslo_concurrency.lockutils [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] Acquired lock "refresh_cache-c8c85f1c-6876-4632-a2d6-a835912d3285" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 769.488010] env[67131]: DEBUG nova.network.neutron [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Refreshing network info cache for port 3fa781fa-5a92-4049-b2b9-5e014576e241 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 770.007965] env[67131]: DEBUG nova.network.neutron [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Updated VIF entry in instance network info cache for port 
3fa781fa-5a92-4049-b2b9-5e014576e241. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 770.009956] env[67131]: DEBUG nova.network.neutron [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Updating instance_info_cache with network_info: [{"id": "97081141-54df-4111-a454-195d22f7e34e", "address": "fa:16:3e:25:db:f2", "network": {"id": "92ab72ca-fd47-4f3c-bd45-3e01040ce0f6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1083960", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.202", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d4a80fb8d6b94a8c874d49ef38dc3169", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac563aa7-6d7c-4bd5-9241-7b3e11b8f22d", "external-id": "nsx-vlan-transportzone-730", "segmentation_id": 730, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap97081141-54", "ovs_interfaceid": "97081141-54df-4111-a454-195d22f7e34e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3fa781fa-5a92-4049-b2b9-5e014576e241", "address": "fa:16:3e:2a:90:66", "network": {"id": "dea481f6-62c8-49dd-9ea7-b64313876b9b", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1989233825", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.81", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "d4a80fb8d6b94a8c874d49ef38dc3169", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0721b358-3768-472d-95f8-6d6755ab1635", "external-id": "nsx-vlan-transportzone-314", "segmentation_id": 314, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3fa781fa-5a", "ovs_interfaceid": "3fa781fa-5a92-4049-b2b9-5e014576e241", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 770.018670] env[67131]: DEBUG oslo_concurrency.lockutils [req-df87a91b-7d3f-464b-b1d0-21f0f77b5c44 req-c8f21042-288f-4a32-80a2-2f2f3a150234 service nova] Releasing lock "refresh_cache-c8c85f1c-6876-4632-a2d6-a835912d3285" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 799.210592] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 799.210898] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 799.234269] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67131) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 799.465544] env[67131]: DEBUG nova.compute.manager [req-53de814c-ee41-4275-bd82-4a6137e00a12 req-356099b0-ee13-4d65-8b8a-e50f1856a598 service nova] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Received event network-vif-deleted-d41dc992-39ae-474a-9412-4ad545dafa0e {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 800.216060] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 800.216458] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67131) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 801.216929] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 801.217191] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Starting heal instance info cache {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 801.217263] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Rebuilding the list of instances to heal {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 801.236755] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Skipping network cache update for instance because it is Building. 
{{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 801.236926] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 801.237380] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 801.237579] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 801.237714] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 801.237843] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 801.237976] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Skipping network cache update for instance because it is Building. 
{{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 801.238113] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 801.238234] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 801.238356] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Didn't find any instances for network info cache update. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 801.238861] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 801.972021] env[67131]: WARNING oslo_vmware.rw_handles [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles response.begin() [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 801.972021] env[67131]: ERROR oslo_vmware.rw_handles [ 801.972873] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/33019715-7e17-4b0c-888b-45e0f1335a7d/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 801.974201] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 801.974455] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/33019715-7e17-4b0c-888b-45e0f1335a7d/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] vmware_temp/33019715-7e17-4b0c-888b-45e0f1335a7d/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 801.974736] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7c57faff-2991-49c3-a8f8-e8de19cc4641 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.983737] env[67131]: DEBUG oslo_vmware.api [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Waiting for the task: (returnval){ [ 801.983737] env[67131]: value = "task-3456461" [ 801.983737] env[67131]: _type = "Task" [ 801.983737] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 801.993095] env[67131]: DEBUG oslo_vmware.api [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Task: {'id': task-3456461, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 802.215921] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.216192] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.216359] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager.update_available_resource {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.244414] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 802.244655] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 802.244785] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 802.244941] env[67131]: DEBUG 
nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67131) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 802.246022] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6620c75-e65c-4ed7-809f-5512a5a4d8ac {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.254794] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca41b65f-4ea0-420f-989d-4f598b647f19 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.269698] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0919204-d82f-4d36-96bf-bf9776927ca0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.277207] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31fcfe77-94c3-48c8-8b4f-65c6ef0f1e40 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.306993] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180871MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67131) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 802.307196] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67131) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 802.307454] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 802.393358] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 47856710-dd0d-4d4a-9af4-ae3db29510e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.393562] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.393698] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.393781] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance b47e3b03-7b84-4305-a55c-577401e5acf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.393902] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 39b67ef8-fce0-4bf3-b161-b5fbd588214b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.394036] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 28bf23c6-d36a-4822-9569-c825a7366ed4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.394158] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 7e46e878-1564-4f3b-baa5-5c99d7e04d80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.394275] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.394422] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance c8c85f1c-6876-4632-a2d6-a835912d3285 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.406044] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 4ce5668d-b588-4b92-bcc9-11d03eff2a84 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.416939] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 3b2e1650-ee7f-46a2-94db-1a611384be03 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.427439] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance ad0b7a66-fcec-4aa9-8a86-aaa24ad9698c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.438062] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 8275c0cc-71d8-4e9c-a324-3955fe1a9943 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.447752] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 691eb0c7-b6f0-45ff-92fb-1e47d38587f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.458025] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 14293002-9e0b-4e4c-b4c5-9c726995dde0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.468507] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 61b77ab6-94d4-4a69-a2f5-b472215c46e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.479055] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance fd9b8dbe-0dc9-4072-93b3-6a06fdd1cd25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.489983] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance d3ea7ca6-6832-4e5c-ae89-40eb286c8bd9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.496782] env[67131]: DEBUG oslo_vmware.exceptions [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Fault InvalidArgument not matched. 
{{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 802.497069] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 802.497627] env[67131]: ERROR nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 802.497627] env[67131]: Faults: ['InvalidArgument'] [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Traceback (most recent call last): [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] yield resources [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] self.driver.spawn(context, instance, image_meta, [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: 
c9a491fe-aff4-4b4f-bcfb-dd56f1010576] self._vmops.spawn(context, instance, image_meta, injected_files, [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] self._fetch_image_if_missing(context, vi) [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] image_cache(vi, tmp_image_ds_loc) [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] vm_util.copy_virtual_disk( [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] session._wait_for_task(vmdk_copy_task) [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] return self.wait_for_task(task_ref) [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: 
c9a491fe-aff4-4b4f-bcfb-dd56f1010576] return evt.wait() [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] result = hub.switch() [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] return self.greenlet.switch() [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] self.f(*self.args, **self.kw) [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] raise exceptions.translate_fault(task_info.error) [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Faults: ['InvalidArgument'] [ 802.497627] env[67131]: ERROR nova.compute.manager [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] [ 802.498484] env[67131]: INFO nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] 
[instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Terminating instance [ 802.499666] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 802.499871] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 802.500438] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance c5368926-ed52-414f-9342-27c71e4e3557 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.501931] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 802.502188] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 802.502365] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e2029090-5eb2-4d01-93eb-eb81f9fd80f9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.504924] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e51117b9-566b-47f2-ad99-83d362d8f929 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.510362] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance b1b04cd3-c691-4689-a7c4-d97798668092 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.513816] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 802.515368] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e50c3754-7597-49fc-8446-5c7a5b9e2a1c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.516278] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 802.516448] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 802.517316] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-97f77b35-772c-42ad-b6ef-6c6adbecb73e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.520874] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 12dd51ad-bb48-4166-a208-0c8f6dd044fe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 802.521110] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 802.521260] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 802.525020] env[67131]: DEBUG oslo_vmware.api [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Waiting for the task: (returnval){ [ 802.525020] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52a6a8cb-9100-a4d2-4fb1-f7bde5eb084d" [ 802.525020] env[67131]: _type = "Task" [ 802.525020] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 802.533016] env[67131]: DEBUG oslo_vmware.api [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52a6a8cb-9100-a4d2-4fb1-f7bde5eb084d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 802.592974] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 802.593227] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 802.593409] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Deleting the datastore file [datastore1] c9a491fe-aff4-4b4f-bcfb-dd56f1010576 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 802.593660] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dbfaee5c-763e-454e-a6e2-91ba29b14390 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.599585] env[67131]: DEBUG oslo_vmware.api [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Waiting for the task: (returnval){ [ 802.599585] env[67131]: value = "task-3456463" [ 802.599585] env[67131]: _type = "Task" [ 802.599585] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 802.607635] env[67131]: DEBUG oslo_vmware.api [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Task: {'id': task-3456463, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 802.776402] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccd3d618-9b06-4b9a-9016-c786c1d230b8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.783920] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57cad963-fa07-4017-b36a-8ebe7dbc99da {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.814599] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fb5cc11-3315-46bf-8602-77d97e3dcf20 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.821566] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-149da959-fa93-4a6f-b820-0e06f81993c8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.835509] env[67131]: DEBUG nova.compute.provider_tree [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 802.843818] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based 
on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 802.859252] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67131) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 802.859433] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.552s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 803.035611] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 803.035860] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Creating directory with path [datastore1] vmware_temp/ae852705-1f67-4306-a1c6-3df407348273/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 803.036089] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-d525adde-9899-4a49-9e64-d6a39d3329ca {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.048270] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Created directory with path [datastore1] vmware_temp/ae852705-1f67-4306-a1c6-3df407348273/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 803.048468] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Fetch image to [datastore1] vmware_temp/ae852705-1f67-4306-a1c6-3df407348273/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 803.048639] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/ae852705-1f67-4306-a1c6-3df407348273/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 803.049387] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbc4eff7-4547-4fa4-8f14-71351d0bc677 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.055813] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e0d2d498-85d5-4125-bda3-e5b3b325c211 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.064690] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5bab713-e5f6-4163-a00e-160886d7ca2c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.094990] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f672a4b-b37b-422f-b114-bddfae2950ad {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.103248] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ec09297a-60da-494f-a8ce-31af43ce7f8d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.109024] env[67131]: DEBUG oslo_vmware.api [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Task: {'id': task-3456463, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079891} completed successfully. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 803.109349] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 803.109461] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 803.109638] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 803.109807] env[67131]: INFO nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 803.111833] env[67131]: DEBUG nova.compute.claims [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 803.112007] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 803.112238] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 803.130281] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 803.143685] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.031s {{(pid=67131) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 803.144994] env[67131]: DEBUG nova.compute.utils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Instance c9a491fe-aff4-4b4f-bcfb-dd56f1010576 could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 803.145737] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Instance disappeared during build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 803.147681] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 803.147681] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 803.147681] env[67131]: DEBUG nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 803.147681] env[67131]: DEBUG nova.network.neutron [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 803.177671] env[67131]: DEBUG nova.network.neutron [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 803.187136] env[67131]: INFO nova.compute.manager [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Took 0.04 seconds to deallocate network for instance. [ 803.189753] env[67131]: DEBUG oslo_vmware.rw_handles [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ae852705-1f67-4306-a1c6-3df407348273/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 803.248785] env[67131]: DEBUG oslo_vmware.rw_handles [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 803.249151] env[67131]: DEBUG oslo_vmware.rw_handles [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ae852705-1f67-4306-a1c6-3df407348273/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 803.268106] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9f8292e3-1956-4574-9f70-253ba20ec633 tempest-ServerExternalEventsTest-539057350 tempest-ServerExternalEventsTest-539057350-project-member] Lock "c9a491fe-aff4-4b4f-bcfb-dd56f1010576" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.905s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 803.278459] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 803.327778] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 803.328038] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 803.329487] env[67131]: INFO nova.compute.claims [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 803.614239] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-746084eb-b6c7-468a-9a41-769157032cfa {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.623890] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-382b9776-d38d-4757-ba33-757a25568473 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.652912] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf33450c-4144-474c-872d-27c8443485dc {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.659717] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87391227-d118-46ba-9470-f6731642db58 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.672526] env[67131]: DEBUG nova.compute.provider_tree [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 803.680788] env[67131]: DEBUG nova.scheduler.client.report [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 803.693013] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.365s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 803.693482] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 
tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 803.725057] env[67131]: DEBUG nova.compute.utils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 803.726421] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 803.726589] env[67131]: DEBUG nova.network.neutron [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 803.734651] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Start building block device mappings for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 803.798902] env[67131]: DEBUG nova.policy [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f1a42dcf9604253bc665ce3f22f3575', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd417d609260447af960c693a66a87882', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 803.800824] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 803.823815] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 803.824091] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 803.824243] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 803.824414] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Flavor pref 0:0:0 
{{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 803.824551] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 803.824803] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 803.825043] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 803.825208] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 803.825371] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 803.825537] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 
tempest-SecurityGroupsTestJSON-1175465788-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 803.825712] env[67131]: DEBUG nova.virt.hardware [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 803.826560] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7894f84f-d64e-46b8-9cb3-eacd8d6fb1a2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.834728] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48110d1e-e3ea-451b-a32f-a3032be29fdb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.858333] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 804.310691] env[67131]: DEBUG nova.network.neutron [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Successfully created port: a9fdd994-c62e-4f95-a34c-05aa741ca90e {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 805.405314] env[67131]: DEBUG nova.network.neutron [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 
4ce5668d-b588-4b92-bcc9-11d03eff2a84] Successfully updated port: a9fdd994-c62e-4f95-a34c-05aa741ca90e {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 805.414720] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Acquiring lock "refresh_cache-4ce5668d-b588-4b92-bcc9-11d03eff2a84" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 805.414920] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Acquired lock "refresh_cache-4ce5668d-b588-4b92-bcc9-11d03eff2a84" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 805.415123] env[67131]: DEBUG nova.network.neutron [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 805.488085] env[67131]: DEBUG nova.network.neutron [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 805.972149] env[67131]: DEBUG nova.network.neutron [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Updating instance_info_cache with network_info: [{"id": "a9fdd994-c62e-4f95-a34c-05aa741ca90e", "address": "fa:16:3e:21:ad:19", "network": {"id": "44ca2137-4e36-49fc-af0b-d7ec768e8cb2", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-357974788-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d417d609260447af960c693a66a87882", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa9fdd994-c6", "ovs_interfaceid": "a9fdd994-c62e-4f95-a34c-05aa741ca90e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 805.988149] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Releasing lock "refresh_cache-4ce5668d-b588-4b92-bcc9-11d03eff2a84" {{(pid=67131) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 805.988449] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Instance network_info: |[{"id": "a9fdd994-c62e-4f95-a34c-05aa741ca90e", "address": "fa:16:3e:21:ad:19", "network": {"id": "44ca2137-4e36-49fc-af0b-d7ec768e8cb2", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-357974788-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d417d609260447af960c693a66a87882", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa9fdd994-c6", "ovs_interfaceid": "a9fdd994-c62e-4f95-a34c-05aa741ca90e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 805.988818] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:21:ad:19', 'network_ref': {'type': 'OpaqueNetwork', 
'network-id': '55bd18a7-39a8-4d07-9088-9b944f9ff710', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a9fdd994-c62e-4f95-a34c-05aa741ca90e', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 805.996410] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Creating folder: Project (d417d609260447af960c693a66a87882). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 805.996981] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-96238dfd-ab81-4aad-b0c7-597d40662bd1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.007193] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Created folder: Project (d417d609260447af960c693a66a87882) in parent group-v690228. [ 806.007434] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Creating folder: Instances. Parent ref: group-v690279. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 806.007694] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-88b05c4d-53d4-4c54-a1cf-c3b1f8f17641 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.015844] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Created folder: Instances in parent group-v690279. [ 806.016164] env[67131]: DEBUG oslo.service.loopingcall [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 806.016376] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 806.016611] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7d9e43fb-af7b-49f1-bdd7-d0f645ac3ab8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.034967] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 806.034967] env[67131]: value = "task-3456466" [ 806.034967] env[67131]: _type = "Task" [ 806.034967] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 806.040060] env[67131]: DEBUG nova.compute.manager [req-a1e821d4-e35c-44a9-883e-51282946d19b req-e2c40b3d-b6bb-49e1-a1bd-ef873699741c service nova] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Received event network-vif-plugged-a9fdd994-c62e-4f95-a34c-05aa741ca90e {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 806.040674] env[67131]: DEBUG oslo_concurrency.lockutils [req-a1e821d4-e35c-44a9-883e-51282946d19b req-e2c40b3d-b6bb-49e1-a1bd-ef873699741c service nova] Acquiring lock "4ce5668d-b588-4b92-bcc9-11d03eff2a84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 806.040674] env[67131]: DEBUG oslo_concurrency.lockutils [req-a1e821d4-e35c-44a9-883e-51282946d19b req-e2c40b3d-b6bb-49e1-a1bd-ef873699741c service nova] Lock "4ce5668d-b588-4b92-bcc9-11d03eff2a84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 806.040674] env[67131]: DEBUG oslo_concurrency.lockutils [req-a1e821d4-e35c-44a9-883e-51282946d19b req-e2c40b3d-b6bb-49e1-a1bd-ef873699741c service nova] Lock "4ce5668d-b588-4b92-bcc9-11d03eff2a84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 806.040928] env[67131]: DEBUG nova.compute.manager [req-a1e821d4-e35c-44a9-883e-51282946d19b req-e2c40b3d-b6bb-49e1-a1bd-ef873699741c service nova] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] No waiting events found dispatching network-vif-plugged-a9fdd994-c62e-4f95-a34c-05aa741ca90e {{(pid=67131) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 806.040928] env[67131]: WARNING nova.compute.manager [req-a1e821d4-e35c-44a9-883e-51282946d19b req-e2c40b3d-b6bb-49e1-a1bd-ef873699741c service nova] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Received unexpected event network-vif-plugged-a9fdd994-c62e-4f95-a34c-05aa741ca90e for instance with vm_state building and task_state spawning. [ 806.045751] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456466, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 806.553087] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456466, 'name': CreateVM_Task} progress is 99%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 807.046498] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456466, 'name': CreateVM_Task} progress is 99%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 807.549586] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456466, 'name': CreateVM_Task, 'duration_secs': 1.310176} completed successfully. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 807.549827] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 807.550554] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 807.550809] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 807.551205] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 807.551526] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e28509a1-d5f0-41e3-ac89-6a7a987d8f31 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.558018] env[67131]: DEBUG oslo_vmware.api [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Waiting for the task: (returnval){ [ 
807.558018] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52773bf0-3ec8-d0c4-b469-03d349136b41" [ 807.558018] env[67131]: _type = "Task" [ 807.558018] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 807.567093] env[67131]: DEBUG oslo_vmware.api [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52773bf0-3ec8-d0c4-b469-03d349136b41, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 808.066945] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 808.067412] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 808.067747] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 808.285264] env[67131]: DEBUG nova.compute.manager 
[req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Received event network-changed-a9fdd994-c62e-4f95-a34c-05aa741ca90e {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 808.285406] env[67131]: DEBUG nova.compute.manager [req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Refreshing instance network info cache due to event network-changed-a9fdd994-c62e-4f95-a34c-05aa741ca90e. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 808.285616] env[67131]: DEBUG oslo_concurrency.lockutils [req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] Acquiring lock "refresh_cache-4ce5668d-b588-4b92-bcc9-11d03eff2a84" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 808.285755] env[67131]: DEBUG oslo_concurrency.lockutils [req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] Acquired lock "refresh_cache-4ce5668d-b588-4b92-bcc9-11d03eff2a84" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 808.285979] env[67131]: DEBUG nova.network.neutron [req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Refreshing network info cache for port a9fdd994-c62e-4f95-a34c-05aa741ca90e {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 808.369944] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquiring lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.976414] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "765e5c4e-c893-41d2-9087-43294f24f5c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.976414] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "765e5c4e-c893-41d2-9087-43294f24f5c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 809.176145] env[67131]: DEBUG nova.network.neutron [req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Updated VIF entry in instance network info cache for port a9fdd994-c62e-4f95-a34c-05aa741ca90e. 
{{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 809.177719] env[67131]: DEBUG nova.network.neutron [req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Updating instance_info_cache with network_info: [{"id": "a9fdd994-c62e-4f95-a34c-05aa741ca90e", "address": "fa:16:3e:21:ad:19", "network": {"id": "44ca2137-4e36-49fc-af0b-d7ec768e8cb2", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-357974788-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d417d609260447af960c693a66a87882", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa9fdd994-c6", "ovs_interfaceid": "a9fdd994-c62e-4f95-a34c-05aa741ca90e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 809.192521] env[67131]: DEBUG oslo_concurrency.lockutils [req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] Releasing lock "refresh_cache-4ce5668d-b588-4b92-bcc9-11d03eff2a84" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 809.192780] env[67131]: DEBUG nova.compute.manager 
[req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Received event network-vif-deleted-e0fe0f11-70fb-43f4-b998-84f21897f4f2 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 809.192962] env[67131]: DEBUG nova.compute.manager [req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Received event network-vif-deleted-383b824e-b71e-46ba-bbfa-c35640e95719 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 809.193146] env[67131]: DEBUG nova.compute.manager [req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Received event network-vif-deleted-fe2e0110-8252-4bb0-bf22-3534de8f8fe2 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 809.193316] env[67131]: DEBUG nova.compute.manager [req-b1e386fb-59ad-4db3-827f-7726a56c835b req-b6f657fa-3fc3-4a27-8210-941649833b71 service nova] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Received event network-vif-deleted-1a66cd40-0c6b-463a-942a-b066de9752e5 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 815.574164] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Acquiring lock "2778d965-ad71-4239-b03a-214cd11b08ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 815.574164] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 
tempest-ServersTestManualDisk-793186305-project-member] Lock "2778d965-ad71-4239-b03a-214cd11b08ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 818.008687] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Acquiring lock "52c4c672-9e1a-42b9-9cf8-ca6de5a6b585" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 818.008929] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Lock "52c4c672-9e1a-42b9-9cf8-ca6de5a6b585" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 823.924839] env[67131]: DEBUG oslo_concurrency.lockutils [None req-8182dd17-09c3-4f1c-8c67-7b44c6b23b5c tempest-ServerAddressesNegativeTestJSON-2077313614 tempest-ServerAddressesNegativeTestJSON-2077313614-project-member] Acquiring lock "e0efe841-2ea3-4da4-973d-984dc5029baa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 823.925155] env[67131]: DEBUG oslo_concurrency.lockutils [None req-8182dd17-09c3-4f1c-8c67-7b44c6b23b5c tempest-ServerAddressesNegativeTestJSON-2077313614 tempest-ServerAddressesNegativeTestJSON-2077313614-project-member] Lock 
"e0efe841-2ea3-4da4-973d-984dc5029baa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 824.233991] env[67131]: DEBUG oslo_concurrency.lockutils [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquiring lock "28bf23c6-d36a-4822-9569-c825a7366ed4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 825.772930] env[67131]: DEBUG oslo_concurrency.lockutils [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquiring lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 849.730399] env[67131]: WARNING oslo_vmware.rw_handles [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 849.730399] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 849.730399] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 849.730399] env[67131]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 849.730399] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 849.730399] env[67131]: ERROR oslo_vmware.rw_handles response.begin() [ 
849.730399] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 849.730399] env[67131]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 849.730399] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 849.730399] env[67131]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 849.730399] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 849.730399] env[67131]: ERROR oslo_vmware.rw_handles [ 849.731031] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/ae852705-1f67-4306-a1c6-3df407348273/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 849.733604] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 849.733604] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Copying Virtual Disk [datastore1] vmware_temp/ae852705-1f67-4306-a1c6-3df407348273/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] 
vmware_temp/ae852705-1f67-4306-a1c6-3df407348273/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 849.733604] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-13450097-eea5-406d-8451-17f65ea3455b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.741190] env[67131]: DEBUG oslo_vmware.api [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Waiting for the task: (returnval){ [ 849.741190] env[67131]: value = "task-3456467" [ 849.741190] env[67131]: _type = "Task" [ 849.741190] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 849.749149] env[67131]: DEBUG oslo_vmware.api [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Task: {'id': task-3456467, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 850.251997] env[67131]: DEBUG oslo_vmware.exceptions [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Fault InvalidArgument not matched. 
{{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 850.252353] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 850.252992] env[67131]: ERROR nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 850.252992] env[67131]: Faults: ['InvalidArgument'] [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Traceback (most recent call last): [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] yield resources [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] self.driver.spawn(context, instance, image_meta, [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 
47856710-dd0d-4d4a-9af4-ae3db29510e9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] self._fetch_image_if_missing(context, vi) [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] image_cache(vi, tmp_image_ds_loc) [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] vm_util.copy_virtual_disk( [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] session._wait_for_task(vmdk_copy_task) [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] return self.wait_for_task(task_ref) [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 
47856710-dd0d-4d4a-9af4-ae3db29510e9] return evt.wait() [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] result = hub.switch() [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] return self.greenlet.switch() [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] self.f(*self.args, **self.kw) [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] raise exceptions.translate_fault(task_info.error) [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Faults: ['InvalidArgument'] [ 850.252992] env[67131]: ERROR nova.compute.manager [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] [ 850.253933] env[67131]: INFO nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] 
[instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Terminating instance [ 850.254892] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 850.255115] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 850.255368] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bca831a9-e855-49e9-b0a6-c1d045b515b0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.257929] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 850.258222] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 850.258990] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ae94a64-4760-424d-991f-0dd2651c916f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.265882] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 850.266173] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-00083000-c782-46aa-be34-6db41382ab8a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.268477] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 850.268706] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 850.269623] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c38c5d80-c12b-4687-ac45-d6bc76c9072f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.274834] env[67131]: DEBUG oslo_vmware.api [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Waiting for the task: (returnval){ [ 850.274834] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]522eb4ae-bbc9-2661-48aa-15a9af1f1bc7" [ 850.274834] env[67131]: _type = "Task" [ 850.274834] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 850.287103] env[67131]: DEBUG oslo_vmware.api [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]522eb4ae-bbc9-2661-48aa-15a9af1f1bc7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 850.338070] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 850.338300] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 850.338478] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Deleting the datastore file [datastore1] 47856710-dd0d-4d4a-9af4-ae3db29510e9 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 850.338742] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bf7c72fe-2226-459d-b04a-01f10ac75539 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.345060] env[67131]: DEBUG oslo_vmware.api [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Waiting for the task: (returnval){ [ 850.345060] env[67131]: value = "task-3456469" [ 850.345060] env[67131]: _type = "Task" [ 850.345060] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 850.353087] env[67131]: DEBUG oslo_vmware.api [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Task: {'id': task-3456469, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 850.784932] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 850.785262] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Creating directory with path [datastore1] vmware_temp/69b28c19-4a05-4ebc-882d-e32447108db5/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 850.785423] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6d3ec5d9-bca2-4535-aa55-7a3e275f44f8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.797267] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Created directory with path [datastore1] vmware_temp/69b28c19-4a05-4ebc-882d-e32447108db5/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 850.797444] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None 
req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Fetch image to [datastore1] vmware_temp/69b28c19-4a05-4ebc-882d-e32447108db5/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 850.797672] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/69b28c19-4a05-4ebc-882d-e32447108db5/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 850.798288] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5a9c7a0-fa8f-42d8-95f7-029b9f526a03 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.805739] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-596ef699-c629-425e-8a34-40081536e700 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.814599] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3535400-85d8-4387-8dc0-a423e13c79c5 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.844518] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbd80c4e-9796-4104-86d0-5b82dc3c9d2a {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.854640] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8ee8fee3-6edb-4caf-b77a-5ce877eb3ce6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.856233] env[67131]: DEBUG oslo_vmware.api [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Task: {'id': task-3456469, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08439} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 850.856457] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 850.856626] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 850.856790] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 850.856958] env[67131]: INFO nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 
47856710-dd0d-4d4a-9af4-ae3db29510e9] Took 0.60 seconds to destroy the instance on the hypervisor. [ 850.859010] env[67131]: DEBUG nova.compute.claims [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 850.859187] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 850.859394] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 850.875941] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 850.887999] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 850.888734] env[67131]: DEBUG nova.compute.utils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Instance 47856710-dd0d-4d4a-9af4-ae3db29510e9 could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 850.892038] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Instance disappeared during build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 850.892038] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 850.892226] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 850.892476] env[67131]: DEBUG nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 850.892476] env[67131]: DEBUG nova.network.neutron [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 850.917195] env[67131]: DEBUG nova.network.neutron [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 850.922873] env[67131]: DEBUG oslo_vmware.rw_handles [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/69b28c19-4a05-4ebc-882d-e32447108db5/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 850.977660] env[67131]: INFO nova.compute.manager [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Took 0.09 seconds to deallocate network for instance. [ 850.982075] env[67131]: DEBUG oslo_vmware.rw_handles [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 850.982273] env[67131]: DEBUG oslo_vmware.rw_handles [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/69b28c19-4a05-4ebc-882d-e32447108db5/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 851.020908] env[67131]: DEBUG oslo_concurrency.lockutils [None req-24b4280e-8b1c-44f7-aa80-4c3503ebe14f tempest-ImagesNegativeTestJSON-123294912 tempest-ImagesNegativeTestJSON-123294912-project-member] Lock "47856710-dd0d-4d4a-9af4-ae3db29510e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 245.213s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 851.029503] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 851.076297] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 851.076573] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 851.077990] env[67131]: INFO nova.compute.claims [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] 
Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 851.336196] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f533b620-bad3-4b49-bc18-d838a8b4b69b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.343618] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6019702-3496-43d2-b2e5-0bbfb7011528 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.372618] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8593473-5d4a-4f3b-8573-0f2af2186e88 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.379897] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2853425-7499-41b9-9a75-589a007b04a5 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.392862] env[67131]: DEBUG nova.compute.provider_tree [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 851.401367] env[67131]: DEBUG nova.scheduler.client.report [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 851.417288] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.341s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 851.417655] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 851.451285] env[67131]: DEBUG nova.compute.utils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 851.451778] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Allocating IP information in the background. 
{{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 851.451983] env[67131]: DEBUG nova.network.neutron [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 851.460093] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 851.523041] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 851.539256] env[67131]: DEBUG nova.policy [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28d70a310b37415f90d70cc76634c682', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58295faf384543e18474769b40dbeb22', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 851.548955] env[67131]: DEBUG nova.virt.hardware [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 851.549263] env[67131]: DEBUG nova.virt.hardware [None 
req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 851.549451] env[67131]: DEBUG nova.virt.hardware [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 851.549637] env[67131]: DEBUG nova.virt.hardware [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 851.549780] env[67131]: DEBUG nova.virt.hardware [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 851.549925] env[67131]: DEBUG nova.virt.hardware [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 851.550185] env[67131]: DEBUG nova.virt.hardware [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 851.550348] env[67131]: DEBUG 
nova.virt.hardware [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 851.550511] env[67131]: DEBUG nova.virt.hardware [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 851.550669] env[67131]: DEBUG nova.virt.hardware [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 851.550839] env[67131]: DEBUG nova.virt.hardware [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 851.551701] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28adc39c-edd2-4d8b-9055-048b985ea808 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.559494] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cdbd00f-d5f8-47e6-ab6b-998df003804c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.987645] env[67131]: DEBUG nova.network.neutron [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 
tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Successfully created port: 3f2dbe5c-1f5a-4056-be85-5f237f83552c {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 853.016928] env[67131]: DEBUG nova.compute.manager [req-bdfbab70-a298-4db2-b01f-81dd798eed31 req-0dd5338b-3056-468c-89fa-00d76e5d5279 service nova] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Received event network-vif-plugged-3f2dbe5c-1f5a-4056-be85-5f237f83552c {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 853.017289] env[67131]: DEBUG oslo_concurrency.lockutils [req-bdfbab70-a298-4db2-b01f-81dd798eed31 req-0dd5338b-3056-468c-89fa-00d76e5d5279 service nova] Acquiring lock "3b2e1650-ee7f-46a2-94db-1a611384be03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 853.017490] env[67131]: DEBUG oslo_concurrency.lockutils [req-bdfbab70-a298-4db2-b01f-81dd798eed31 req-0dd5338b-3056-468c-89fa-00d76e5d5279 service nova] Lock "3b2e1650-ee7f-46a2-94db-1a611384be03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 853.017624] env[67131]: DEBUG oslo_concurrency.lockutils [req-bdfbab70-a298-4db2-b01f-81dd798eed31 req-0dd5338b-3056-468c-89fa-00d76e5d5279 service nova] Lock "3b2e1650-ee7f-46a2-94db-1a611384be03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 853.017792] env[67131]: DEBUG nova.compute.manager [req-bdfbab70-a298-4db2-b01f-81dd798eed31 req-0dd5338b-3056-468c-89fa-00d76e5d5279 service nova] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] No 
waiting events found dispatching network-vif-plugged-3f2dbe5c-1f5a-4056-be85-5f237f83552c {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 853.017953] env[67131]: WARNING nova.compute.manager [req-bdfbab70-a298-4db2-b01f-81dd798eed31 req-0dd5338b-3056-468c-89fa-00d76e5d5279 service nova] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Received unexpected event network-vif-plugged-3f2dbe5c-1f5a-4056-be85-5f237f83552c for instance with vm_state building and task_state spawning. [ 853.070532] env[67131]: DEBUG nova.network.neutron [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Successfully updated port: 3f2dbe5c-1f5a-4056-be85-5f237f83552c {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 853.080292] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Acquiring lock "refresh_cache-3b2e1650-ee7f-46a2-94db-1a611384be03" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 853.080438] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Acquired lock "refresh_cache-3b2e1650-ee7f-46a2-94db-1a611384be03" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 853.080581] env[67131]: DEBUG nova.network.neutron [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Building network info cache for instance {{(pid=67131) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 853.154085] env[67131]: DEBUG nova.network.neutron [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 853.424923] env[67131]: DEBUG nova.network.neutron [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Updating instance_info_cache with network_info: [{"id": "3f2dbe5c-1f5a-4056-be85-5f237f83552c", "address": "fa:16:3e:8e:09:5a", "network": {"id": "b79f936c-e8ae-43cb-94a7-b66efa7c1380", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1297831731-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "58295faf384543e18474769b40dbeb22", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc845e3-654b-43c6-acea-dde1084f0ad0", "external-id": "nsx-vlan-transportzone-344", "segmentation_id": 344, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3f2dbe5c-1f", "ovs_interfaceid": "3f2dbe5c-1f5a-4056-be85-5f237f83552c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} 
[ 853.435402] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Releasing lock "refresh_cache-3b2e1650-ee7f-46a2-94db-1a611384be03" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 853.435668] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Instance network_info: |[{"id": "3f2dbe5c-1f5a-4056-be85-5f237f83552c", "address": "fa:16:3e:8e:09:5a", "network": {"id": "b79f936c-e8ae-43cb-94a7-b66efa7c1380", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1297831731-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "58295faf384543e18474769b40dbeb22", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc845e3-654b-43c6-acea-dde1084f0ad0", "external-id": "nsx-vlan-transportzone-344", "segmentation_id": 344, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3f2dbe5c-1f", "ovs_interfaceid": "3f2dbe5c-1f5a-4056-be85-5f237f83552c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 853.436053] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None 
req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8e:09:5a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ccc845e3-654b-43c6-acea-dde1084f0ad0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3f2dbe5c-1f5a-4056-be85-5f237f83552c', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 853.443589] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Creating folder: Project (58295faf384543e18474769b40dbeb22). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 853.443904] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-db42d818-ab05-4f60-9688-dda1ba125f5d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 853.454413] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Created folder: Project (58295faf384543e18474769b40dbeb22) in parent group-v690228. [ 853.454596] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Creating folder: Instances. Parent ref: group-v690282. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 853.454802] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d7fa2620-8e58-41c0-9276-e6a602fa985a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 853.463595] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Created folder: Instances in parent group-v690282. [ 853.463805] env[67131]: DEBUG oslo.service.loopingcall [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 853.463971] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 853.464207] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0f018b4b-9f4d-4687-a696-11fce148e0b1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 853.481744] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 853.481744] env[67131]: value = "task-3456472" [ 853.481744] env[67131]: _type = "Task" [ 853.481744] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 853.491647] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456472, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 853.992684] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456472, 'name': CreateVM_Task} progress is 99%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 854.493310] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456472, 'name': CreateVM_Task} progress is 99%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 854.994296] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456472, 'name': CreateVM_Task, 'duration_secs': 1.276509} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 854.994491] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 854.995257] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 854.995423] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 854.995727] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Acquired external semaphore 
"[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 854.995975] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-73c20bd4-6488-4785-9ef2-5ff88d43d3d3 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.000378] env[67131]: DEBUG oslo_vmware.api [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Waiting for the task: (returnval){ [ 855.000378] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5245f8c9-144c-66d0-5da7-651b04a4adae" [ 855.000378] env[67131]: _type = "Task" [ 855.000378] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 855.008104] env[67131]: DEBUG oslo_vmware.api [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5245f8c9-144c-66d0-5da7-651b04a4adae, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 855.092547] env[67131]: DEBUG nova.compute.manager [req-78e17089-61a8-4481-8a79-0e1716da0d86 req-8f59d86f-f862-409d-b054-0d1e7476ea44 service nova] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Received event network-changed-3f2dbe5c-1f5a-4056-be85-5f237f83552c {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 855.092710] env[67131]: DEBUG nova.compute.manager [req-78e17089-61a8-4481-8a79-0e1716da0d86 req-8f59d86f-f862-409d-b054-0d1e7476ea44 service nova] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Refreshing instance network info cache due to event network-changed-3f2dbe5c-1f5a-4056-be85-5f237f83552c. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 855.093069] env[67131]: DEBUG oslo_concurrency.lockutils [req-78e17089-61a8-4481-8a79-0e1716da0d86 req-8f59d86f-f862-409d-b054-0d1e7476ea44 service nova] Acquiring lock "refresh_cache-3b2e1650-ee7f-46a2-94db-1a611384be03" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 855.093266] env[67131]: DEBUG oslo_concurrency.lockutils [req-78e17089-61a8-4481-8a79-0e1716da0d86 req-8f59d86f-f862-409d-b054-0d1e7476ea44 service nova] Acquired lock "refresh_cache-3b2e1650-ee7f-46a2-94db-1a611384be03" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 855.093441] env[67131]: DEBUG nova.network.neutron [req-78e17089-61a8-4481-8a79-0e1716da0d86 req-8f59d86f-f862-409d-b054-0d1e7476ea44 service nova] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Refreshing network info cache for port 3f2dbe5c-1f5a-4056-be85-5f237f83552c {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 855.336031] env[67131]: DEBUG nova.network.neutron [req-78e17089-61a8-4481-8a79-0e1716da0d86 req-8f59d86f-f862-409d-b054-0d1e7476ea44 service nova] [instance: 
3b2e1650-ee7f-46a2-94db-1a611384be03] Updated VIF entry in instance network info cache for port 3f2dbe5c-1f5a-4056-be85-5f237f83552c. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 855.336163] env[67131]: DEBUG nova.network.neutron [req-78e17089-61a8-4481-8a79-0e1716da0d86 req-8f59d86f-f862-409d-b054-0d1e7476ea44 service nova] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Updating instance_info_cache with network_info: [{"id": "3f2dbe5c-1f5a-4056-be85-5f237f83552c", "address": "fa:16:3e:8e:09:5a", "network": {"id": "b79f936c-e8ae-43cb-94a7-b66efa7c1380", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1297831731-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "58295faf384543e18474769b40dbeb22", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc845e3-654b-43c6-acea-dde1084f0ad0", "external-id": "nsx-vlan-transportzone-344", "segmentation_id": 344, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3f2dbe5c-1f", "ovs_interfaceid": "3f2dbe5c-1f5a-4056-be85-5f237f83552c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 855.345297] env[67131]: DEBUG oslo_concurrency.lockutils [req-78e17089-61a8-4481-8a79-0e1716da0d86 req-8f59d86f-f862-409d-b054-0d1e7476ea44 service nova] Releasing lock "refresh_cache-3b2e1650-ee7f-46a2-94db-1a611384be03" {{(pid=67131) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 855.511264] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 855.511606] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 855.511696] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 858.216278] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 858.216535] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Cleaning up deleted instances {{(pid=67131) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 858.237317] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] There are 5 instances to clean {{(pid=67131) _run_pending_deletes 
/opt/stack/nova/nova/compute/manager.py:11110}}
[ 858.237608] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Instance has had 0 of 5 cleanup attempts {{(pid=67131) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 858.290615] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Instance has had 0 of 5 cleanup attempts {{(pid=67131) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 858.333012] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Instance has had 0 of 5 cleanup attempts {{(pid=67131) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 858.359594] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 47856710-dd0d-4d4a-9af4-ae3db29510e9] Instance has had 0 of 5 cleanup attempts {{(pid=67131) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 858.382977] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: c9a491fe-aff4-4b4f-bcfb-dd56f1010576] Instance has had 0 of 5 cleanup attempts {{(pid=67131) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 858.402745] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 858.402936] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Cleaning up deleted instances with incomplete migration {{(pid=67131) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}}
[ 858.411979] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 860.417557] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 861.211509] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 861.215108] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 861.215270] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Starting heal instance info cache {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 861.215391] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Rebuilding the list of instances to heal {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 861.231975] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 861.232183] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 861.232296] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 861.232421] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 861.232542] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 861.232659] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 861.232775] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 861.232972] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Didn't find any instances for network info cache update. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 861.233557] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 862.215193] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 862.215489] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 862.215604] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67131) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 863.215051] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager.update_available_resource {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 863.226963] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 863.227293] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 863.227365] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 863.227519] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67131) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 863.228582] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcdc70b3-d358-459d-b1ab-1f88935886ef {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 863.237430] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7954e3ae-e037-473e-b747-c0257f902ea7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 863.251443] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a52f0b7e-0613-4617-9c93-34329e97cf43 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 863.257603] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eed329fa-9bb6-498b-a258-a3e2ab4c7c7e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 863.287345] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180891MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67131) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 863.287475] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 863.287658] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 863.433778] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 39b67ef8-fce0-4bf3-b161-b5fbd588214b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 863.433944] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 28bf23c6-d36a-4822-9569-c825a7366ed4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 863.434095] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 7e46e878-1564-4f3b-baa5-5c99d7e04d80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 863.434221] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 863.434352] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance c8c85f1c-6876-4632-a2d6-a835912d3285 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 863.434473] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 4ce5668d-b588-4b92-bcc9-11d03eff2a84 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 863.434592] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 3b2e1650-ee7f-46a2-94db-1a611384be03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 863.446443] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance ad0b7a66-fcec-4aa9-8a86-aaa24ad9698c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.457078] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 8275c0cc-71d8-4e9c-a324-3955fe1a9943 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.468653] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 691eb0c7-b6f0-45ff-92fb-1e47d38587f6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.479419] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 14293002-9e0b-4e4c-b4c5-9c726995dde0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.489693] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 61b77ab6-94d4-4a69-a2f5-b472215c46e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.499042] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance fd9b8dbe-0dc9-4072-93b3-6a06fdd1cd25 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.508707] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance d3ea7ca6-6832-4e5c-ae89-40eb286c8bd9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.518185] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance c5368926-ed52-414f-9342-27c71e4e3557 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.528505] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance b1b04cd3-c691-4689-a7c4-d97798668092 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.538336] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 12dd51ad-bb48-4166-a208-0c8f6dd044fe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.547553] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 765e5c4e-c893-41d2-9087-43294f24f5c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.557622] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 2778d965-ad71-4239-b03a-214cd11b08ed has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.566972] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.576894] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance e0efe841-2ea3-4da4-973d-984dc5029baa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 863.577295] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 863.577481] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 863.595201] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Refreshing inventories for resource provider d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 863.608801] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Updating ProviderTree inventory for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 863.608998] env[67131]: DEBUG nova.compute.provider_tree [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Updating inventory in ProviderTree for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 863.619687] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Refreshing aggregate associations for resource provider d05f24fe-4395-4079-99ef-1ac1245f55e5, aggregates: None {{(pid=67131) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 863.635780] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Refreshing trait associations for resource provider d05f24fe-4395-4079-99ef-1ac1245f55e5, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67131) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 863.874941] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fce4c7e-7a07-4d73-884b-916080b6e087 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 863.882425] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d2ed9fd-727a-4287-935d-9ce338625147 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 863.913085] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-556f213e-41dd-4d16-9fdf-c9a215897156 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 863.920092] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcb18a15-8552-452b-a178-74d737dae1bc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 863.933686] env[67131]: DEBUG nova.compute.provider_tree [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 863.942278] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 863.955527] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67131) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 863.955678] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.668s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 864.957218] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 865.215675] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 886.242247] env[67131]: DEBUG nova.compute.manager [req-b0e08883-2bb6-450d-b832-a122492f70fa req-49e1f726-d8e5-40b9-8c53-b27c765d4178 service nova] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Received event network-vif-deleted-5606f9ef-2dff-4b18-b3fd-b9b295425170 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 895.642318] env[67131]: DEBUG nova.compute.manager [req-c5eb5578-1750-4ff9-b414-de3de576f069 req-a0e0c4aa-137f-494c-bb68-4a64573364af service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Received event network-vif-deleted-3fa781fa-5a92-4049-b2b9-5e014576e241 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 898.027341] env[67131]: DEBUG nova.compute.manager [req-656ed019-3f8e-4b05-809c-a7e071c36634 req-f6273d0d-4494-4bbe-88f2-c0255d040485 service nova] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Received event network-vif-deleted-a9fdd994-c62e-4f95-a34c-05aa741ca90e {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 898.027611] env[67131]: DEBUG nova.compute.manager [req-656ed019-3f8e-4b05-809c-a7e071c36634 req-f6273d0d-4494-4bbe-88f2-c0255d040485 service nova] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Received event network-vif-deleted-97081141-54df-4111-a454-195d22f7e34e {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 898.027711] env[67131]: DEBUG nova.compute.manager [req-656ed019-3f8e-4b05-809c-a7e071c36634 req-f6273d0d-4494-4bbe-88f2-c0255d040485 service nova] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Received event network-vif-deleted-3f2dbe5c-1f5a-4056-be85-5f237f83552c {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 900.145507] env[67131]: WARNING oslo_vmware.rw_handles [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles     response.begin()
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 900.145507] env[67131]: ERROR oslo_vmware.rw_handles
[ 900.146150] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/69b28c19-4a05-4ebc-882d-e32447108db5/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 900.147641] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 900.147890] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Copying Virtual Disk [datastore1] vmware_temp/69b28c19-4a05-4ebc-882d-e32447108db5/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] vmware_temp/69b28c19-4a05-4ebc-882d-e32447108db5/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 900.148416] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f9a9c334-87bf-4045-a137-e43d11f62dfb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 900.156995] env[67131]: DEBUG oslo_vmware.api [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Waiting for the task: (returnval){
[ 900.156995] env[67131]:     value = "task-3456473"
[ 900.156995] env[67131]:     _type = "Task"
[ 900.156995] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 900.166390] env[67131]: DEBUG oslo_vmware.api [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Task: {'id': task-3456473, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 900.668816] env[67131]: DEBUG oslo_vmware.exceptions [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Fault InvalidArgument not matched. {{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 900.668816] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 900.670380] env[67131]: ERROR nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 900.670380] env[67131]: Faults: ['InvalidArgument']
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Traceback (most recent call last):
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     yield resources
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     self.driver.spawn(context, instance, image_meta,
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     self._fetch_image_if_missing(context, vi)
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     image_cache(vi, tmp_image_ds_loc)
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     vm_util.copy_virtual_disk(
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     session._wait_for_task(vmdk_copy_task)
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     return self.wait_for_task(task_ref)
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     return evt.wait()
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     result = hub.switch()
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     return self.greenlet.switch()
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     self.f(*self.args, **self.kw)
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]     raise exceptions.translate_fault(task_info.error)
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Faults: ['InvalidArgument']
[ 900.670380] env[67131]: ERROR nova.compute.manager [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605]
[ 900.670380] env[67131]: INFO nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Terminating instance
[ 900.671659] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 900.671659] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 900.671803] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2c42af65-c542-4fc4-9e9a-506a9cfd6f89 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 900.675482] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923
tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 900.675660] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 900.676424] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83c4e386-e4f7-4b2e-aef3-41fbab3093cb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.683612] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 900.683837] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-69e2e156-7b88-42bb-ac62-7369fe53b459 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.686212] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 900.686372] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 
tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 900.687343] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d93c1bf9-b6ae-4d8d-a1cb-aa11c70754cd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.695941] env[67131]: DEBUG oslo_vmware.api [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Waiting for the task: (returnval){ [ 900.695941] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]529d4f88-f578-16fa-1763-9c1df3eeddf5" [ 900.695941] env[67131]: _type = "Task" [ 900.695941] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 900.704150] env[67131]: DEBUG oslo_vmware.api [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]529d4f88-f578-16fa-1763-9c1df3eeddf5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 900.761693] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 900.761816] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 900.761989] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Deleting the datastore file [datastore1] f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 900.762296] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d01c1aee-ea14-435c-96e4-77472d6cdf93 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.771214] env[67131]: DEBUG oslo_vmware.api [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Waiting for the task: (returnval){ [ 900.771214] env[67131]: value = "task-3456475" [ 900.771214] env[67131]: _type = "Task" [ 900.771214] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 900.780615] env[67131]: DEBUG oslo_vmware.api [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Task: {'id': task-3456475, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 901.208409] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 901.209058] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Creating directory with path [datastore1] vmware_temp/4f9036f5-973c-4229-a16f-3cc5aebc9fa9/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 901.211044] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-21fc9454-aca6-4f24-97a2-be907d709044 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.228529] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Created directory with path [datastore1] vmware_temp/4f9036f5-973c-4229-a16f-3cc5aebc9fa9/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 901.228892] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None 
req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Fetch image to [datastore1] vmware_temp/4f9036f5-973c-4229-a16f-3cc5aebc9fa9/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 901.229216] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/4f9036f5-973c-4229-a16f-3cc5aebc9fa9/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 901.230117] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dd14391-fa5f-45ea-823f-1f392a62bafe {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.243398] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ec85b4f-c315-4d63-a55a-2b88281adfc9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.253695] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53630ed2-8920-443e-b15d-3978fb4f86e8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.294470] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffa75269-1734-4e29-a6d3-e304d388c239 {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.305112] env[67131]: DEBUG oslo_vmware.api [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Task: {'id': task-3456475, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073866} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 901.305374] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 901.305555] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 901.305723] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 901.305891] env[67131]: INFO nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 901.309626] env[67131]: DEBUG nova.compute.claims [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 901.310252] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 901.310252] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 901.312749] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0380762a-01d5-4d49-bdcf-aa5a9312e8d2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.335584] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 901.342460] env[67131]: DEBUG oslo_concurrency.lockutils [None 
req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.031s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.342460] env[67131]: DEBUG nova.compute.utils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Instance f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605 could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 901.342890] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Instance disappeared during build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 901.342978] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 901.343323] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 901.343507] env[67131]: DEBUG nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 901.343666] env[67131]: DEBUG nova.network.neutron [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 901.374276] env[67131]: DEBUG nova.network.neutron [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 901.384566] env[67131]: DEBUG oslo_vmware.rw_handles [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4f9036f5-973c-4229-a16f-3cc5aebc9fa9/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 901.444096] env[67131]: INFO nova.compute.manager [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] [instance: f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605] Took 0.10 seconds to deallocate network for instance. [ 901.448514] env[67131]: DEBUG oslo_vmware.rw_handles [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 901.448718] env[67131]: DEBUG oslo_vmware.rw_handles [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4f9036f5-973c-4229-a16f-3cc5aebc9fa9/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 901.508431] env[67131]: DEBUG oslo_concurrency.lockutils [None req-de70822d-dbdd-4957-973a-439577c052de tempest-ImagesOneServerNegativeTestJSON-987996923 tempest-ImagesOneServerNegativeTestJSON-987996923-project-member] Lock "f2ce3bf9-e6fe-4b9b-8af1-5c18b4a7b605" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 294.378s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.520991] env[67131]: DEBUG nova.compute.manager [None req-2a5dfae5-6b8c-4294-8af0-4af23a7ddbe8 tempest-AttachInterfacesTestJSON-2011988282 tempest-AttachInterfacesTestJSON-2011988282-project-member] [instance: ad0b7a66-fcec-4aa9-8a86-aaa24ad9698c] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 901.554548] env[67131]: DEBUG nova.compute.manager [None req-2a5dfae5-6b8c-4294-8af0-4af23a7ddbe8 tempest-AttachInterfacesTestJSON-2011988282 tempest-AttachInterfacesTestJSON-2011988282-project-member] [instance: ad0b7a66-fcec-4aa9-8a86-aaa24ad9698c] Instance disappeared before build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 901.579812] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a5dfae5-6b8c-4294-8af0-4af23a7ddbe8 tempest-AttachInterfacesTestJSON-2011988282 tempest-AttachInterfacesTestJSON-2011988282-project-member] Lock "ad0b7a66-fcec-4aa9-8a86-aaa24ad9698c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.477s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.598431] env[67131]: DEBUG nova.compute.manager [None req-b8b17b98-2599-47e2-a17a-4d60b612169a tempest-FloatingIPsAssociationNegativeTestJSON-324741952 tempest-FloatingIPsAssociationNegativeTestJSON-324741952-project-member] [instance: 8275c0cc-71d8-4e9c-a324-3955fe1a9943] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 901.625147] env[67131]: DEBUG nova.compute.manager [None req-b8b17b98-2599-47e2-a17a-4d60b612169a tempest-FloatingIPsAssociationNegativeTestJSON-324741952 tempest-FloatingIPsAssociationNegativeTestJSON-324741952-project-member] [instance: 8275c0cc-71d8-4e9c-a324-3955fe1a9943] Instance disappeared before build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 901.649422] env[67131]: DEBUG oslo_concurrency.lockutils [None req-b8b17b98-2599-47e2-a17a-4d60b612169a tempest-FloatingIPsAssociationNegativeTestJSON-324741952 tempest-FloatingIPsAssociationNegativeTestJSON-324741952-project-member] Lock "8275c0cc-71d8-4e9c-a324-3955fe1a9943" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.410s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.663263] env[67131]: DEBUG nova.compute.manager [None req-16e75bcb-ec0c-4d58-a17c-cf2065d7f4eb tempest-AttachVolumeNegativeTest-1306521917 tempest-AttachVolumeNegativeTest-1306521917-project-member] [instance: 691eb0c7-b6f0-45ff-92fb-1e47d38587f6] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 901.697668] env[67131]: DEBUG nova.compute.manager [None req-16e75bcb-ec0c-4d58-a17c-cf2065d7f4eb tempest-AttachVolumeNegativeTest-1306521917 tempest-AttachVolumeNegativeTest-1306521917-project-member] [instance: 691eb0c7-b6f0-45ff-92fb-1e47d38587f6] Instance disappeared before build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 901.724486] env[67131]: DEBUG oslo_concurrency.lockutils [None req-16e75bcb-ec0c-4d58-a17c-cf2065d7f4eb tempest-AttachVolumeNegativeTest-1306521917 tempest-AttachVolumeNegativeTest-1306521917-project-member] Lock "691eb0c7-b6f0-45ff-92fb-1e47d38587f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.787s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.736897] env[67131]: DEBUG nova.compute.manager [None req-42c2489f-75b5-4752-9a2f-cd4c35a7dc3f tempest-ServerGroupTestJSON-306088605 tempest-ServerGroupTestJSON-306088605-project-member] [instance: 14293002-9e0b-4e4c-b4c5-9c726995dde0] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 901.759258] env[67131]: DEBUG nova.compute.manager [None req-42c2489f-75b5-4752-9a2f-cd4c35a7dc3f tempest-ServerGroupTestJSON-306088605 tempest-ServerGroupTestJSON-306088605-project-member] [instance: 14293002-9e0b-4e4c-b4c5-9c726995dde0] Instance disappeared before build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 901.781779] env[67131]: DEBUG oslo_concurrency.lockutils [None req-42c2489f-75b5-4752-9a2f-cd4c35a7dc3f tempest-ServerGroupTestJSON-306088605 tempest-ServerGroupTestJSON-306088605-project-member] Lock "14293002-9e0b-4e4c-b4c5-9c726995dde0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.368s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.791956] env[67131]: DEBUG nova.compute.manager [None req-ba335090-d472-4f0e-92f7-65ea60eedd52 tempest-ServersTestJSON-1541738038 tempest-ServersTestJSON-1541738038-project-member] [instance: 61b77ab6-94d4-4a69-a2f5-b472215c46e7] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 901.823477] env[67131]: DEBUG nova.compute.manager [None req-ba335090-d472-4f0e-92f7-65ea60eedd52 tempest-ServersTestJSON-1541738038 tempest-ServersTestJSON-1541738038-project-member] [instance: 61b77ab6-94d4-4a69-a2f5-b472215c46e7] Instance disappeared before build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 901.844803] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba335090-d472-4f0e-92f7-65ea60eedd52 tempest-ServersTestJSON-1541738038 tempest-ServersTestJSON-1541738038-project-member] Lock "61b77ab6-94d4-4a69-a2f5-b472215c46e7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.949s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.854147] env[67131]: DEBUG nova.compute.manager [None req-fea2e719-11ff-4518-a3c6-b705cf9c6ed1 tempest-ServersNegativeTestJSON-641837042 tempest-ServersNegativeTestJSON-641837042-project-member] [instance: fd9b8dbe-0dc9-4072-93b3-6a06fdd1cd25] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 901.877846] env[67131]: DEBUG nova.compute.manager [None req-fea2e719-11ff-4518-a3c6-b705cf9c6ed1 tempest-ServersNegativeTestJSON-641837042 tempest-ServersNegativeTestJSON-641837042-project-member] [instance: fd9b8dbe-0dc9-4072-93b3-6a06fdd1cd25] Instance disappeared before build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 901.902452] env[67131]: DEBUG oslo_concurrency.lockutils [None req-fea2e719-11ff-4518-a3c6-b705cf9c6ed1 tempest-ServersNegativeTestJSON-641837042 tempest-ServersNegativeTestJSON-641837042-project-member] Lock "fd9b8dbe-0dc9-4072-93b3-6a06fdd1cd25" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.196s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.913988] env[67131]: DEBUG nova.compute.manager [None req-6ae82b6f-0fb4-4404-82c0-39de16f44410 tempest-ServersTestBootFromVolume-1447370321 tempest-ServersTestBootFromVolume-1447370321-project-member] [instance: d3ea7ca6-6832-4e5c-ae89-40eb286c8bd9] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 901.938884] env[67131]: DEBUG nova.compute.manager [None req-6ae82b6f-0fb4-4404-82c0-39de16f44410 tempest-ServersTestBootFromVolume-1447370321 tempest-ServersTestBootFromVolume-1447370321-project-member] [instance: d3ea7ca6-6832-4e5c-ae89-40eb286c8bd9] Instance disappeared before build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 901.964166] env[67131]: DEBUG oslo_concurrency.lockutils [None req-6ae82b6f-0fb4-4404-82c0-39de16f44410 tempest-ServersTestBootFromVolume-1447370321 tempest-ServersTestBootFromVolume-1447370321-project-member] Lock "d3ea7ca6-6832-4e5c-ae89-40eb286c8bd9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.015s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 901.973552] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 902.028224] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 902.028471] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 902.030019] env[67131]: INFO nova.compute.claims [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 
tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 902.229108] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f1a3e5d-8d35-49bf-93ef-01c37aaa6a15 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 902.236999] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b9cf51c-a6d7-4405-9149-a365d8d54164 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 902.274221] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-857a77e4-d54b-49de-8bcf-e9e29c7f14fa {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 902.283349] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-603db8f0-2dcd-40cd-ac37-37eea5063236 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 902.297215] env[67131]: DEBUG nova.compute.provider_tree [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 902.306888] env[67131]: DEBUG nova.scheduler.client.report [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 
'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 902.325400] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 902.325617] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 902.372247] env[67131]: DEBUG nova.compute.utils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 902.374138] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Allocating IP information in the background. 
{{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 902.374312] env[67131]: DEBUG nova.network.neutron [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 902.385650] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 902.465519] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 902.490312] env[67131]: DEBUG nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 902.490589] env[67131]: DEBUG nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 902.490744] env[67131]: DEBUG nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 902.490922] env[67131]: DEBUG nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 
tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 902.491183] env[67131]: DEBUG nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 902.491368] env[67131]: DEBUG nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 902.491826] env[67131]: DEBUG nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 902.492028] env[67131]: DEBUG nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 902.492279] env[67131]: DEBUG nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 902.492446] env[67131]: DEBUG 
nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 902.492619] env[67131]: DEBUG nova.virt.hardware [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 902.493538] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f02a99d0-f847-465f-a2ed-91da0239f3ce {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 902.502862] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15f75016-e5cc-4921-97ae-e037d1a56031 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 902.531523] env[67131]: DEBUG nova.policy [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4592b7eabfae410eab52681de20c3dc2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c82c1460f5da435b80fb50a8e3c7deba', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 
903.621812] env[67131]: DEBUG nova.network.neutron [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Successfully created port: 62b502cf-fd8f-43c0-9aaa-64e434de61c7 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 905.687530] env[67131]: DEBUG nova.network.neutron [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Successfully updated port: 62b502cf-fd8f-43c0-9aaa-64e434de61c7 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 905.702016] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Acquiring lock "refresh_cache-c5368926-ed52-414f-9342-27c71e4e3557" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 905.702180] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Acquired lock "refresh_cache-c5368926-ed52-414f-9342-27c71e4e3557" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 905.702354] env[67131]: DEBUG nova.network.neutron [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 906.231146] env[67131]: DEBUG 
nova.network.neutron [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 906.265313] env[67131]: DEBUG nova.compute.manager [req-fe568d2a-fbb8-4d81-a535-a4994850cedf req-9fb3888c-ea74-4735-9314-731dc4f98f4b service nova] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Received event network-vif-plugged-62b502cf-fd8f-43c0-9aaa-64e434de61c7 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 906.265565] env[67131]: DEBUG oslo_concurrency.lockutils [req-fe568d2a-fbb8-4d81-a535-a4994850cedf req-9fb3888c-ea74-4735-9314-731dc4f98f4b service nova] Acquiring lock "c5368926-ed52-414f-9342-27c71e4e3557-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 906.265775] env[67131]: DEBUG oslo_concurrency.lockutils [req-fe568d2a-fbb8-4d81-a535-a4994850cedf req-9fb3888c-ea74-4735-9314-731dc4f98f4b service nova] Lock "c5368926-ed52-414f-9342-27c71e4e3557-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 906.265936] env[67131]: DEBUG oslo_concurrency.lockutils [req-fe568d2a-fbb8-4d81-a535-a4994850cedf req-9fb3888c-ea74-4735-9314-731dc4f98f4b service nova] Lock "c5368926-ed52-414f-9342-27c71e4e3557-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 906.267588] env[67131]: DEBUG nova.compute.manager [req-fe568d2a-fbb8-4d81-a535-a4994850cedf 
req-9fb3888c-ea74-4735-9314-731dc4f98f4b service nova] [instance: c5368926-ed52-414f-9342-27c71e4e3557] No waiting events found dispatching network-vif-plugged-62b502cf-fd8f-43c0-9aaa-64e434de61c7 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 906.268124] env[67131]: WARNING nova.compute.manager [req-fe568d2a-fbb8-4d81-a535-a4994850cedf req-9fb3888c-ea74-4735-9314-731dc4f98f4b service nova] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Received unexpected event network-vif-plugged-62b502cf-fd8f-43c0-9aaa-64e434de61c7 for instance with vm_state building and task_state spawning. [ 906.769755] env[67131]: DEBUG nova.network.neutron [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Updating instance_info_cache with network_info: [{"id": "62b502cf-fd8f-43c0-9aaa-64e434de61c7", "address": "fa:16:3e:24:fd:9b", "network": {"id": "cdbd3b83-4e0c-4305-86e4-c0ea4629746f", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-111978579-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c82c1460f5da435b80fb50a8e3c7deba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62b502cf-fd", "ovs_interfaceid": "62b502cf-fd8f-43c0-9aaa-64e434de61c7", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 906.786217] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Releasing lock "refresh_cache-c5368926-ed52-414f-9342-27c71e4e3557" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 906.786498] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Instance network_info: |[{"id": "62b502cf-fd8f-43c0-9aaa-64e434de61c7", "address": "fa:16:3e:24:fd:9b", "network": {"id": "cdbd3b83-4e0c-4305-86e4-c0ea4629746f", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-111978579-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c82c1460f5da435b80fb50a8e3c7deba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap62b502cf-fd", "ovs_interfaceid": "62b502cf-fd8f-43c0-9aaa-64e434de61c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 906.786841] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:fd:9b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c0d5204b-f60e-4830-84c8-2fe246c28202', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '62b502cf-fd8f-43c0-9aaa-64e434de61c7', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 906.795162] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Creating folder: Project (c82c1460f5da435b80fb50a8e3c7deba). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 906.795731] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9e6ae1f2-f029-483c-b747-74aae45ac037 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.808030] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Created folder: Project (c82c1460f5da435b80fb50a8e3c7deba) in parent group-v690228. 
[ 906.808030] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Creating folder: Instances. Parent ref: group-v690285. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 906.808030] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f3c975fa-e46d-46fd-b2b2-2355eb141d31 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.819647] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Created folder: Instances in parent group-v690285. [ 906.819901] env[67131]: DEBUG oslo.service.loopingcall [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 906.820135] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 906.820610] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-64231582-eecb-46c7-a69d-b6e03821d553 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.839788] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 906.839788] env[67131]: value = "task-3456478" [ 906.839788] env[67131]: _type = "Task" [ 906.839788] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 906.848702] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456478, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 907.353109] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456478, 'name': CreateVM_Task, 'duration_secs': 0.322738} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 907.353109] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 907.353109] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 907.353109] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 907.353109] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 
907.353109] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-668194dc-b9ba-4f95-9453-c4c9f017917a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.358821] env[67131]: DEBUG oslo_vmware.api [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Waiting for the task: (returnval){ [ 907.358821] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5254dfe2-d70f-bfdf-ab29-ca57dca1c9d0" [ 907.358821] env[67131]: _type = "Task" [ 907.358821] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 907.369795] env[67131]: DEBUG oslo_vmware.api [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5254dfe2-d70f-bfdf-ab29-ca57dca1c9d0, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 907.873014] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 907.875108] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 907.875108] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 908.803661] env[67131]: DEBUG nova.compute.manager [req-6b6efa62-bbbe-482e-9ef7-2396f864e194 req-2cd38308-dcd6-4fea-b18c-ed343c056502 service nova] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Received event network-changed-62b502cf-fd8f-43c0-9aaa-64e434de61c7 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 908.803859] env[67131]: DEBUG nova.compute.manager [req-6b6efa62-bbbe-482e-9ef7-2396f864e194 req-2cd38308-dcd6-4fea-b18c-ed343c056502 service nova] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Refreshing instance network info cache due to event 
network-changed-62b502cf-fd8f-43c0-9aaa-64e434de61c7. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 908.804079] env[67131]: DEBUG oslo_concurrency.lockutils [req-6b6efa62-bbbe-482e-9ef7-2396f864e194 req-2cd38308-dcd6-4fea-b18c-ed343c056502 service nova] Acquiring lock "refresh_cache-c5368926-ed52-414f-9342-27c71e4e3557" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 908.805711] env[67131]: DEBUG oslo_concurrency.lockutils [req-6b6efa62-bbbe-482e-9ef7-2396f864e194 req-2cd38308-dcd6-4fea-b18c-ed343c056502 service nova] Acquired lock "refresh_cache-c5368926-ed52-414f-9342-27c71e4e3557" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 908.805711] env[67131]: DEBUG nova.network.neutron [req-6b6efa62-bbbe-482e-9ef7-2396f864e194 req-2cd38308-dcd6-4fea-b18c-ed343c056502 service nova] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Refreshing network info cache for port 62b502cf-fd8f-43c0-9aaa-64e434de61c7 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 908.835055] env[67131]: DEBUG nova.network.neutron [req-6b6efa62-bbbe-482e-9ef7-2396f864e194 req-2cd38308-dcd6-4fea-b18c-ed343c056502 service nova] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 909.115542] env[67131]: DEBUG nova.network.neutron [req-6b6efa62-bbbe-482e-9ef7-2396f864e194 req-2cd38308-dcd6-4fea-b18c-ed343c056502 service nova] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Instance is deleted, no further info cache update {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:106}} [ 909.115904] env[67131]: DEBUG oslo_concurrency.lockutils [req-6b6efa62-bbbe-482e-9ef7-2396f864e194 req-2cd38308-dcd6-4fea-b18c-ed343c056502 service nova] Releasing lock "refresh_cache-c5368926-ed52-414f-9342-27c71e4e3557" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 909.115951] env[67131]: DEBUG nova.compute.manager [req-6b6efa62-bbbe-482e-9ef7-2396f864e194 req-2cd38308-dcd6-4fea-b18c-ed343c056502 service nova] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Received event network-vif-deleted-62b502cf-fd8f-43c0-9aaa-64e434de61c7 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 919.214436] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 921.216049] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 921.216318] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 922.215230] env[67131]: 
DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 922.215423] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Starting heal instance info cache {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 922.215548] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Rebuilding the list of instances to heal {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 922.231690] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 922.231933] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 922.231978] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 922.232120] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Didn't find any instances for network info cache update. 
{{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 922.232630] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 922.232768] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67131) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 923.227727] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 924.215676] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 924.216282] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager.update_available_resource {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 924.227538] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 924.227774] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 924.227933] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 924.228159] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67131) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 924.231786] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aef5bf9-0603-41f1-814f-de59df5bf1dd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 924.247035] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db5d4b7b-29c9-48ea-8c2e-c4a42b7a6472 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 924.267341] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e26198e3-a6db-4676-889b-bf741e019826 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 924.275759] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5116eb62-1b93-4bd7-ab55-8c340a815f0b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 924.311572] env[67131]: DEBUG nova.compute.resource_tracker [None 
req-4069ad72-fa06-4547-8754-92aad801728f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180881MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67131) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 924.311807] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 924.312024] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 924.368681] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 39b67ef8-fce0-4bf3-b161-b5fbd588214b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 924.368830] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 28bf23c6-d36a-4822-9569-c825a7366ed4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 924.368954] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 7e46e878-1564-4f3b-baa5-5c99d7e04d80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 924.383241] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 765e5c4e-c893-41d2-9087-43294f24f5c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 924.399685] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 2778d965-ad71-4239-b03a-214cd11b08ed has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 924.447490] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 924.467024] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance e0efe841-2ea3-4da4-973d-984dc5029baa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 924.467024] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 924.467024] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 924.600450] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74e30b68-41da-40fc-9ccd-75e76ed63f05 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 924.609028] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08bd7dc4-b71b-4b1d-9afc-07dff01480b8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 924.639473] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61f7f06e-b7fc-49a7-adc0-05bee0a54c2e {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 924.648115] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-677b498b-4b83-4130-8140-b3ec27212b27 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 924.663879] env[67131]: DEBUG nova.compute.provider_tree [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 924.675791] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 924.717976] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67131) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 924.718199] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.406s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 926.718569] env[67131]: DEBUG oslo_service.periodic_task [None 
req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 927.215727] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 928.091313] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Acquiring lock "2fd6ec26-9e42-43fd-a09c-de43a8107aee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 928.091617] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Lock "2fd6ec26-9e42-43fd-a09c-de43a8107aee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 929.820686] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Acquiring lock "0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 929.820960] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a 
tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Lock "0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 929.852719] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Acquiring lock "8f4044dd-2e1d-40cb-99f9-96c9584bfe5e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 929.852719] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Lock "8f4044dd-2e1d-40cb-99f9-96c9584bfe5e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 947.023575] env[67131]: WARNING oslo_vmware.rw_handles [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 947.023575] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 947.023575] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 947.023575] env[67131]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 947.023575] env[67131]: ERROR 
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 947.023575] env[67131]: ERROR oslo_vmware.rw_handles response.begin() [ 947.023575] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 947.023575] env[67131]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 947.023575] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 947.023575] env[67131]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 947.023575] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 947.023575] env[67131]: ERROR oslo_vmware.rw_handles [ 947.024305] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/4f9036f5-973c-4229-a16f-3cc5aebc9fa9/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 947.025900] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 947.026173] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/4f9036f5-973c-4229-a16f-3cc5aebc9fa9/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] vmware_temp/4f9036f5-973c-4229-a16f-3cc5aebc9fa9/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 947.026469] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-129139d4-2bb2-41ce-ab9a-6ba9ed1ada79 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.033744] env[67131]: DEBUG oslo_vmware.api [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Waiting for the task: (returnval){ [ 947.033744] env[67131]: value = "task-3456479" [ 947.033744] env[67131]: _type = "Task" [ 947.033744] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 947.041595] env[67131]: DEBUG oslo_vmware.api [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Task: {'id': task-3456479, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 947.547082] env[67131]: DEBUG oslo_vmware.exceptions [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Fault InvalidArgument not matched. 
{{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 947.547368] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 947.547910] env[67131]: ERROR nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 947.547910] env[67131]: Faults: ['InvalidArgument'] [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Traceback (most recent call last): [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] yield resources [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] self.driver.spawn(context, instance, image_meta, [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: 
fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] self._fetch_image_if_missing(context, vi) [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] image_cache(vi, tmp_image_ds_loc) [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] vm_util.copy_virtual_disk( [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] session._wait_for_task(vmdk_copy_task) [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] return self.wait_for_task(task_ref) [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: 
fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] return evt.wait() [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] result = hub.switch() [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] return self.greenlet.switch() [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] self.f(*self.args, **self.kw) [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] raise exceptions.translate_fault(task_info.error) [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Faults: ['InvalidArgument'] [ 947.547910] env[67131]: ERROR nova.compute.manager [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] [ 947.548761] env[67131]: INFO nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] 
[instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Terminating instance [ 947.549794] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 947.550036] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 947.550266] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7e046536-35b8-4ec7-aac9-b51a2ebea8da {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.552351] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 947.552576] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 947.553305] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbbe0dd4-d2e9-447a-adda-2e617164c57c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.560053] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 947.560271] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-851eb825-b7fa-467d-8811-b11cf26aa6e0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.562298] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 947.562503] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 947.563402] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cd67cc7d-1c8e-46d1-bfd7-c7f8e3213454 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.567795] env[67131]: DEBUG oslo_vmware.api [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Waiting for the task: (returnval){ [ 947.567795] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]529456df-4abf-1e2e-b6ca-45eb5b71d688" [ 947.567795] env[67131]: _type = "Task" [ 947.567795] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 947.577893] env[67131]: DEBUG oslo_vmware.api [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]529456df-4abf-1e2e-b6ca-45eb5b71d688, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 947.631337] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 947.631560] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 947.631740] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Deleting the datastore file [datastore1] fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 947.631999] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-75afc75d-3e9a-4ed3-98e5-d42dcc5472c6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.639050] env[67131]: DEBUG oslo_vmware.api [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Waiting for the task: (returnval){ [ 947.639050] env[67131]: value = "task-3456481" [ 947.639050] env[67131]: _type = "Task" [ 947.639050] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 947.645448] env[67131]: DEBUG oslo_vmware.api [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Task: {'id': task-3456481, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.077540] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 948.077791] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Creating directory with path [datastore1] vmware_temp/e2f1f390-1eef-439c-b756-1f5a8c1f69b3/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 948.078009] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cef92aeb-7d64-467c-bec2-82bc9c33bc63 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.089332] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Created directory with path [datastore1] vmware_temp/e2f1f390-1eef-439c-b756-1f5a8c1f69b3/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 948.089511] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None 
req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Fetch image to [datastore1] vmware_temp/e2f1f390-1eef-439c-b756-1f5a8c1f69b3/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 948.089678] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/e2f1f390-1eef-439c-b756-1f5a8c1f69b3/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 948.090412] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0297a40-49d6-4240-85cc-8e9a526d21ed {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.096750] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ad766c1-02bf-463b-b46a-4a43c027579c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.106373] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-891a5caa-0e5c-41a3-8b01-b9d67906ac35 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.136101] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aac4e0f7-a436-4893-99ee-9816ddd6ea57 {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.142845] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c930ae7b-96c8-481d-8657-55558cb38399 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.146956] env[67131]: DEBUG oslo_vmware.api [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Task: {'id': task-3456481, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.097614} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 948.147452] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 948.147662] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 948.147846] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 948.148027] env[67131]: INFO nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 
tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Took 0.60 seconds to destroy the instance on the hypervisor. [ 948.149993] env[67131]: DEBUG nova.compute.claims [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 948.150206] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.150414] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.165249] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 948.177369] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 
tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.178029] env[67131]: DEBUG nova.compute.utils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Instance fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 948.179389] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Instance disappeared during build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 948.179557] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 948.179716] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 948.179976] env[67131]: DEBUG nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 948.180057] env[67131]: DEBUG nova.network.neutron [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 948.203286] env[67131]: DEBUG nova.network.neutron [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 948.211373] env[67131]: INFO nova.compute.manager [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c] Took 0.03 seconds to deallocate network for instance. 
[ 948.251845] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ba6dea0d-b827-405f-b0bf-3ce881dddc80 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "fd5bdebf-fc65-47e3-8a3e-b0c196c9f90c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 339.970s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.260533] env[67131]: DEBUG nova.compute.manager [None req-5f440b03-70d1-47a0-ab9f-8d31fbe9aaf0 tempest-InstanceActionsNegativeTestJSON-524430231 tempest-InstanceActionsNegativeTestJSON-524430231-project-member] [instance: b1b04cd3-c691-4689-a7c4-d97798668092] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 948.282409] env[67131]: DEBUG nova.compute.manager [None req-5f440b03-70d1-47a0-ab9f-8d31fbe9aaf0 tempest-InstanceActionsNegativeTestJSON-524430231 tempest-InstanceActionsNegativeTestJSON-524430231-project-member] [instance: b1b04cd3-c691-4689-a7c4-d97798668092] Instance disappeared before build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 948.302718] env[67131]: DEBUG oslo_concurrency.lockutils [None req-5f440b03-70d1-47a0-ab9f-8d31fbe9aaf0 tempest-InstanceActionsNegativeTestJSON-524430231 tempest-InstanceActionsNegativeTestJSON-524430231-project-member] Lock "b1b04cd3-c691-4689-a7c4-d97798668092" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.737s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.305435] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 948.306983] env[67131]: ERROR nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. 
[ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Traceback (most recent call last): [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] result = getattr(controller, method)(*args, **kwargs) [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self._get(image_id) [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] resp, body = self.http_client.get(url, headers=header) [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in 
get [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self.request(url, 'GET', **kwargs) [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self._handle_response(resp) [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise exc.from_response(resp, resp.content) [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] During handling of the above exception, another exception occurred: [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Traceback (most recent call last): [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] yield resources [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self.driver.spawn(context, instance, image_meta, [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self._fetch_image_if_missing(context, vi) [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] image_fetch(context, vi, tmp_image_ds_loc) [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] images.fetch_image( [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 948.306983] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] metadata = IMAGE_API.get(context, image_ref) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return session.show(context, image_id, [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] _reraise_translated_image_exception(image_id) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise new_exc.with_traceback(exc_trace) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: 
b47e3b03-7b84-4305-a55c-577401e5acf3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] result = getattr(controller, method)(*args, **kwargs) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self._get(image_id) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] resp, body = self.http_client.get(url, headers=header) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self.request(url, 'GET', **kwargs) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 948.308021] 
env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self._handle_response(resp) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise exc.from_response(resp, resp.content) [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. [ 948.308021] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 948.308021] env[67131]: INFO nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Terminating instance [ 948.308625] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 948.308755] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 948.309335] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 
tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 948.309519] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 948.309928] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f57dce7f-50c1-4e61-81b6-33b3322696df {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.312481] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca132ff7-d484-4606-87ed-f2d886695271 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.315860] env[67131]: DEBUG nova.compute.manager [None req-1ff5c02f-bcca-4e5a-a77f-044999f529c7 tempest-ServerActionsV293TestJSON-563675839 tempest-ServerActionsV293TestJSON-563675839-project-member] [instance: 12dd51ad-bb48-4166-a208-0c8f6dd044fe] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 948.322560] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 948.322745] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-993a26bb-908d-467e-8c6a-c5d45f49bfa2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.324960] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 948.325143] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 948.326087] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fd99e7d2-173b-4662-bffb-62d93e48931f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.332283] env[67131]: DEBUG oslo_vmware.api [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Waiting for the task: (returnval){ [ 948.332283] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52f23a12-0565-9803-660a-b379bbe42384" [ 948.332283] env[67131]: _type = "Task" [ 948.332283] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.339647] env[67131]: DEBUG oslo_vmware.api [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52f23a12-0565-9803-660a-b379bbe42384, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.340014] env[67131]: DEBUG nova.compute.manager [None req-1ff5c02f-bcca-4e5a-a77f-044999f529c7 tempest-ServerActionsV293TestJSON-563675839 tempest-ServerActionsV293TestJSON-563675839-project-member] [instance: 12dd51ad-bb48-4166-a208-0c8f6dd044fe] Instance disappeared before build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 948.363316] env[67131]: DEBUG oslo_concurrency.lockutils [None req-1ff5c02f-bcca-4e5a-a77f-044999f529c7 tempest-ServerActionsV293TestJSON-563675839 tempest-ServerActionsV293TestJSON-563675839-project-member] Lock "12dd51ad-bb48-4166-a208-0c8f6dd044fe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.925s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.375361] env[67131]: DEBUG nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 948.389685] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 948.389896] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 948.390087] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Deleting the datastore file [datastore1] 
b47e3b03-7b84-4305-a55c-577401e5acf3 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 948.390339] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-eda45e93-91ce-402d-b4ef-294490ed6a08 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.397839] env[67131]: DEBUG oslo_vmware.api [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Waiting for the task: (returnval){ [ 948.397839] env[67131]: value = "task-3456483" [ 948.397839] env[67131]: _type = "Task" [ 948.397839] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.405235] env[67131]: DEBUG oslo_vmware.api [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Task: {'id': task-3456483, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.435160] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.435326] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.436624] env[67131]: INFO nova.compute.claims [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 948.601074] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96cad9b0-15fa-4c83-bd79-8e408d31f0ea {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.608424] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8483f430-6550-4e41-a6ad-4af2e96389fa {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.636918] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44cdc130-46ae-42e0-b33f-b3e9e8f9aad4 {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 948.643681] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96507818-2b8c-48cd-b131-731ddffb53bd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 948.658219] env[67131]: DEBUG nova.compute.provider_tree [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 948.669617] env[67131]: DEBUG nova.scheduler.client.report [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 948.683089] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 948.683565] env[67131]: DEBUG nova.compute.manager [None
req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 948.715238] env[67131]: DEBUG nova.compute.utils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 948.717301] env[67131]: DEBUG nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 948.717499] env[67131]: DEBUG nova.network.neutron [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 948.725310] env[67131]: DEBUG nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Start building block device mappings for instance.
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 948.790111] env[67131]: DEBUG nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Start spawning the instance on the hypervisor. {{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 948.811595] env[67131]: DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 948.811931] env[67131]: DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 948.811991] env[67131]: DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532
tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 948.812174] env[67131]: DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 948.812320] env[67131]: DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 948.812507] env[67131]: DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 948.812793] env[67131]: DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 948.812882] env[67131]: DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 948.813192] env[67131]:
DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 948.813405] env[67131]: DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 948.813602] env[67131]: DEBUG nova.virt.hardware [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 948.814685] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02d946b2-eb1b-47b9-a402-25dc3167bfb9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 948.823720] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef74bb22-d927-44b7-8ab0-a057147fb4a4 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 948.839636] env[67131]: DEBUG nova.policy [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0e4b1d02497b41958ef3cad6559d17ec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id':
'69f3b2f337df4724b66c563c996ed8bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}}
[ 948.848595] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 948.848835] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Creating directory with path [datastore1] vmware_temp/8b671220-c10e-4499-b192-85a8598e9d67/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 948.849069] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-98888c81-e12d-43f7-a436-1967f670d39b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 948.860557] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Created directory with path [datastore1] vmware_temp/8b671220-c10e-4499-b192-85a8598e9d67/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 948.860754] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Fetch
image to [datastore1] vmware_temp/8b671220-c10e-4499-b192-85a8598e9d67/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 948.860923] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/8b671220-c10e-4499-b192-85a8598e9d67/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 948.862092] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efe91043-4cfb-4703-8a2e-bd38ce4bf75c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 948.870424] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-001a5fae-6e6c-4768-a369-070094745353 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 948.878911] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-535d7819-c875-406a-b973-8a0b2b85ed73 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 948.914010] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16a4b89f-3ad8-495c-8dd2-2b05f843f8ab {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 948.921821] env[67131]: DEBUG oslo_vmware.api [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636
tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Task: {'id': task-3456483, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.087016} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 948.927013] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 948.927013] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 948.927013] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 948.927013] env[67131]: INFO nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 948.927013] env[67131]: DEBUG nova.compute.claims [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 948.927013] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 948.927013] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 948.928888] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d40c38d7-64c3-4a6d-950b-83ab4cdf384c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 948.951516] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 948.952228] env[67131]: DEBUG nova.compute.utils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b
tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Instance b47e3b03-7b84-4305-a55c-577401e5acf3 could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 948.953729] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Instance disappeared during build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 948.953888] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 948.954060] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 948.954250] env[67131]: DEBUG nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 948.954475] env[67131]: DEBUG nova.network.neutron [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 948.958509] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 949.008217] env[67131]: DEBUG oslo_vmware.rw_handles [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8b671220-c10e-4499-b192-85a8598e9d67/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1.
{{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 949.064996] env[67131]: DEBUG oslo_vmware.rw_handles [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 949.065198] env[67131]: DEBUG oslo_vmware.rw_handles [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8b671220-c10e-4499-b192-85a8598e9d67/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 949.115649] env[67131]: DEBUG neutronclient.v2_0.client [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67131) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 949.119721] env[67131]: ERROR nova.compute.manager [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Traceback (most recent call last):
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] result = getattr(controller, method)(*args, **kwargs)
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self._get(image_id)
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return RequestIdProxy(wrapped(*args, **kwargs))
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] resp, body = self.http_client.get(url, headers=header)
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in
get
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self.request(url, 'GET', **kwargs)
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self._handle_response(resp)
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise exc.from_response(resp, resp.content)
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3]
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] During handling of the above exception, another exception occurred:
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3]
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Traceback (most recent call last):
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self.driver.spawn(context, instance, image_meta,
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self._fetch_image_if_missing(context, vi)
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] image_fetch(context, vi, tmp_image_ds_loc)
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] images.fetch_image(
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] metadata = IMAGE_API.get(context, image_ref)
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return session.show(context, image_id,
[ 949.119721] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] _reraise_translated_image_exception(image_id)
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise new_exc.with_traceback(exc_trace)
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 949.120814]
env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] result = getattr(controller, method)(*args, **kwargs)
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self._get(image_id)
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return RequestIdProxy(wrapped(*args, **kwargs))
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] resp, body = self.http_client.get(url, headers=header)
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self.request(url, 'GET', **kwargs)
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self._handle_response(resp)
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise exc.from_response(resp, resp.content)
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511.
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3]
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] During handling of the above exception, another exception occurred:
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3]
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Traceback (most recent call last):
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self._build_and_run_instance(context, instance, image,
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] with excutils.save_and_reraise_exception():
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self.force_reraise()
[ 949.120814] env[67131]: ERROR
nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise self.value [ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] with self.rt.instance_claim(context, instance, node, allocs, [ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self.abort() [ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 949.120814] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return f(*args, **kwargs) [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self._unset_instance_host_and_node(instance) [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: 
b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] instance.save() [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] updates, result = self.indirection_api.object_action( [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return cctxt.call(context, 'object_action', objinst=objinst, [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] result = self.transport._send( [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self._driver.send(target, ctxt, message, [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self._send(target, ctxt, message, wait_for_reply, 
timeout, [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise result [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] nova.exception_Remote.InstanceNotFound_Remote: Instance b47e3b03-7b84-4305-a55c-577401e5acf3 could not be found. [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Traceback (most recent call last): [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return getattr(target, method)(*args, **kwargs) [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return fn(self, *args, **kwargs) [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] old_ref, inst_ref = db.instance_update_and_get_original( [ 949.122223] 
env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return f(*args, **kwargs) [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] with excutils.save_and_reraise_exception() as ectxt: [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self.force_reraise() [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise self.value [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 949.122223] env[67131]: ERROR 
nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return f(*args, **kwargs) [ 949.122223] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return f(context, *args, **kwargs) [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise exception.InstanceNotFound(instance_id=uuid) [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] nova.exception.InstanceNotFound: Instance b47e3b03-7b84-4305-a55c-577401e5acf3 could not be found. 
[ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] During handling of the above exception, another exception occurred: [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Traceback (most recent call last): [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] ret = obj(*args, **kwargs) [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] exception_handler_v20(status_code, error_body) [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise client_exc(message=error_message, [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: 
b47e3b03-7b84-4305-a55c-577401e5acf3] Neutron server returns request_ids: ['req-f492b005-c919-4ab1-a952-afbebbab5c6e'] [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] During handling of the above exception, another exception occurred: [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] Traceback (most recent call last): [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self._deallocate_network(context, instance, requested_networks) [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self.network_api.deallocate_for_instance( [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] data = neutron.list_ports(**search_opts) [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] ret = obj(*args, **kwargs) [ 949.123876] env[67131]: ERROR nova.compute.manager 
[instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self.list('ports', self.ports_path, retrieve_all, [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] ret = obj(*args, **kwargs) [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 949.123876] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] for r in self._pagination(collection, path, **params): [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] res = self.get(path, params=params) [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] ret = obj(*args, **kwargs) [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self.retry_request("GET", action, body=body, [ 949.125264] env[67131]: ERROR 
nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] ret = obj(*args, **kwargs) [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] return self.do_request(method, action, body=body, [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] ret = obj(*args, **kwargs) [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] self._handle_fault_response(status_code, replybody, resp) [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] raise exception.Unauthorized() [ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] nova.exception.Unauthorized: Not authorized. 
[ 949.125264] env[67131]: ERROR nova.compute.manager [instance: b47e3b03-7b84-4305-a55c-577401e5acf3] [ 949.143404] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d95f7af6-48d1-4b93-8c7f-14b06cff085b tempest-ServersAdminNegativeTestJSON-1853394636 tempest-ServersAdminNegativeTestJSON-1853394636-project-member] Lock "b47e3b03-7b84-4305-a55c-577401e5acf3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 340.601s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.152576] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 949.199122] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.199380] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.200907] env[67131]: INFO nova.compute.claims [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] 
Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 949.271929] env[67131]: DEBUG nova.network.neutron [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Successfully created port: 3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 949.390297] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d370032-ef47-46ce-b239-db3795f6cc36 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.398405] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa1341f1-8417-4dc7-8fbf-97bd30c112a0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.428881] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca2b96f9-3c0a-4810-9c8d-d31e88797035 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.436023] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c13d50a-1392-4bb3-88eb-181e81252584 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.448829] env[67131]: DEBUG nova.compute.provider_tree [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 949.456622] env[67131]: DEBUG nova.scheduler.client.report [None 
req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 949.471012] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.471485] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Start building networks asynchronously for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 949.506445] env[67131]: DEBUG nova.compute.utils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 949.507849] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 949.507931] env[67131]: DEBUG nova.network.neutron [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 949.515793] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Start building block device mappings for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 949.568561] env[67131]: DEBUG nova.policy [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bc4206d138904bdd9d60560632f1ef37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '29932003e4164dbf99b32c2bfffe7dbf', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 949.576757] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 949.602068] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 949.602311] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 949.602468] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 949.602745] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Flavor pref 0:0:0 {{(pid=67131) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 949.602804] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 949.602934] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 949.603148] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 949.603307] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 949.603469] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 949.603625] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 949.603788] env[67131]: DEBUG nova.virt.hardware [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 949.604634] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b258bfb-2829-4efe-b7e3-d8b9540f4945 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.612399] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-daf338a7-315e-4521-9361-13bde0cb8819 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.278546] env[67131]: DEBUG nova.compute.manager [req-f8756a9d-c47e-4126-9360-96bf6587774f req-be3b9027-8e33-4d5e-beda-c5eafe4daf1e service nova] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Received event network-vif-plugged-3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 950.278797] env[67131]: DEBUG oslo_concurrency.lockutils [req-f8756a9d-c47e-4126-9360-96bf6587774f req-be3b9027-8e33-4d5e-beda-c5eafe4daf1e service nova] Acquiring lock "765e5c4e-c893-41d2-9087-43294f24f5c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 950.278963] env[67131]: DEBUG oslo_concurrency.lockutils [req-f8756a9d-c47e-4126-9360-96bf6587774f req-be3b9027-8e33-4d5e-beda-c5eafe4daf1e service nova] Lock "765e5c4e-c893-41d2-9087-43294f24f5c3-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 950.279141] env[67131]: DEBUG oslo_concurrency.lockutils [req-f8756a9d-c47e-4126-9360-96bf6587774f req-be3b9027-8e33-4d5e-beda-c5eafe4daf1e service nova] Lock "765e5c4e-c893-41d2-9087-43294f24f5c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 950.279357] env[67131]: DEBUG nova.compute.manager [req-f8756a9d-c47e-4126-9360-96bf6587774f req-be3b9027-8e33-4d5e-beda-c5eafe4daf1e service nova] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] No waiting events found dispatching network-vif-plugged-3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 950.279466] env[67131]: WARNING nova.compute.manager [req-f8756a9d-c47e-4126-9360-96bf6587774f req-be3b9027-8e33-4d5e-beda-c5eafe4daf1e service nova] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Received unexpected event network-vif-plugged-3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939 for instance with vm_state building and task_state spawning. 
[ 950.429661] env[67131]: DEBUG nova.network.neutron [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Successfully updated port: 3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 950.443024] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "refresh_cache-765e5c4e-c893-41d2-9087-43294f24f5c3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.443024] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquired lock "refresh_cache-765e5c4e-c893-41d2-9087-43294f24f5c3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 950.443024] env[67131]: DEBUG nova.network.neutron [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 950.481846] env[67131]: DEBUG nova.network.neutron [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Successfully created port: 5107eb24-3ca5-41a7-989b-d6c99660cf97 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 950.495973] env[67131]: DEBUG nova.network.neutron [None 
req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 950.747751] env[67131]: DEBUG nova.network.neutron [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Updating instance_info_cache with network_info: [{"id": "3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939", "address": "fa:16:3e:44:1e:f9", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.241", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3ae6f6d2-5e", "ovs_interfaceid": "3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.760016] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 
tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Releasing lock "refresh_cache-765e5c4e-c893-41d2-9087-43294f24f5c3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.760302] env[67131]: DEBUG nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Instance network_info: |[{"id": "3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939", "address": "fa:16:3e:44:1e:f9", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.241", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3ae6f6d2-5e", "ovs_interfaceid": "3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 950.760678] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] 
[instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:44:1e:f9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd4cb37d4-2060-48b6-9e60-156a71fc7ee3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 950.768102] env[67131]: DEBUG oslo.service.loopingcall [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 950.768510] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 950.768730] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-df657741-919f-41cc-bab1-fe0ba5e5cdf8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.789713] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 950.789713] env[67131]: value = "task-3456484" [ 950.789713] env[67131]: _type = "Task" [ 950.789713] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 950.796995] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456484, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 951.245305] env[67131]: DEBUG nova.network.neutron [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Successfully updated port: 5107eb24-3ca5-41a7-989b-d6c99660cf97 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 951.255108] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Acquiring lock "refresh_cache-2778d965-ad71-4239-b03a-214cd11b08ed" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 951.255258] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Acquired lock "refresh_cache-2778d965-ad71-4239-b03a-214cd11b08ed" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 951.255410] env[67131]: DEBUG nova.network.neutron [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 951.299589] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456484, 'name': CreateVM_Task, 'duration_secs': 0.294458} completed successfully. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 951.299882] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 951.300369] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 951.300592] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 951.300821] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 951.301053] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ea9f277d-2d7b-4b9a-83d7-237d60cc0961 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 951.305558] env[67131]: DEBUG oslo_vmware.api [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] 
Waiting for the task: (returnval){ [ 951.305558] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52d99ea8-2001-9de8-e61a-f91ad9fd44a4" [ 951.305558] env[67131]: _type = "Task" [ 951.305558] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 951.313598] env[67131]: DEBUG oslo_vmware.api [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52d99ea8-2001-9de8-e61a-f91ad9fd44a4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 951.329070] env[67131]: DEBUG nova.network.neutron [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 951.740879] env[67131]: DEBUG nova.network.neutron [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Updating instance_info_cache with network_info: [{"id": "5107eb24-3ca5-41a7-989b-d6c99660cf97", "address": "fa:16:3e:c5:4c:c3", "network": {"id": "bc831a22-de5c-415e-843d-fbbe9b4f2cfb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1582826414-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "29932003e4164dbf99b32c2bfffe7dbf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2", "external-id": "nsx-vlan-transportzone-268", "segmentation_id": 268, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5107eb24-3c", "ovs_interfaceid": "5107eb24-3ca5-41a7-989b-d6c99660cf97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 951.751053] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Releasing lock "refresh_cache-2778d965-ad71-4239-b03a-214cd11b08ed" {{(pid=67131) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 951.751287] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Instance network_info: |[{"id": "5107eb24-3ca5-41a7-989b-d6c99660cf97", "address": "fa:16:3e:c5:4c:c3", "network": {"id": "bc831a22-de5c-415e-843d-fbbe9b4f2cfb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1582826414-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "29932003e4164dbf99b32c2bfffe7dbf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2", "external-id": "nsx-vlan-transportzone-268", "segmentation_id": 268, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5107eb24-3c", "ovs_interfaceid": "5107eb24-3ca5-41a7-989b-d6c99660cf97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 951.751635] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c5:4c:c3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 
'44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5107eb24-3ca5-41a7-989b-d6c99660cf97', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 951.758872] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Creating folder: Project (29932003e4164dbf99b32c2bfffe7dbf). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 951.759326] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-deb2bad8-5a93-4c6d-949e-b8b944dea6f9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 951.769407] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Created folder: Project (29932003e4164dbf99b32c2bfffe7dbf) in parent group-v690228. [ 951.769586] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Creating folder: Instances. Parent ref: group-v690289. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 951.769791] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-774eee75-0d1d-4f70-8392-e19a4b407e08 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 951.777702] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Created folder: Instances in parent group-v690289. [ 951.777909] env[67131]: DEBUG oslo.service.loopingcall [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 951.778084] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 951.778259] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f7c6e151-d0e7-4e6b-b853-eeace2750b2d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 951.796105] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 951.796105] env[67131]: value = "task-3456487" [ 951.796105] env[67131]: _type = "Task" [ 951.796105] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 951.803032] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456487, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 951.813298] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 951.813528] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 951.813730] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 952.306459] env[67131]: DEBUG nova.compute.manager [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Received event network-changed-3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 952.306685] env[67131]: DEBUG nova.compute.manager [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Refreshing instance network info cache due to event network-changed-3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 952.306856] env[67131]: DEBUG oslo_concurrency.lockutils [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] Acquiring lock "refresh_cache-765e5c4e-c893-41d2-9087-43294f24f5c3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 952.306996] env[67131]: DEBUG oslo_concurrency.lockutils [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] Acquired lock "refresh_cache-765e5c4e-c893-41d2-9087-43294f24f5c3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 952.307351] env[67131]: DEBUG nova.network.neutron [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Refreshing network info cache for port 3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 952.312172] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456487, 'name': CreateVM_Task, 'duration_secs': 0.267027} completed successfully. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 952.312551] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 952.313164] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 952.313320] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 952.313654] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 952.313873] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6655655c-b72e-4c62-b80b-2f28888cc462 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.321461] env[67131]: DEBUG oslo_vmware.api [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Waiting for the task: (returnval){ [ 952.321461] 
env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5245863e-7260-cfe5-d18e-ab3e15abd41a" [ 952.321461] env[67131]: _type = "Task" [ 952.321461] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 952.330684] env[67131]: DEBUG oslo_vmware.api [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5245863e-7260-cfe5-d18e-ab3e15abd41a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 952.697636] env[67131]: DEBUG nova.network.neutron [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Updated VIF entry in instance network info cache for port 3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 952.698100] env[67131]: DEBUG nova.network.neutron [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Updating instance_info_cache with network_info: [{"id": "3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939", "address": "fa:16:3e:44:1e:f9", "network": {"id": "461d4381-d986-4f8b-a934-086c906e8e3d", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.241", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b748d6760ea143ada1c17aae946fd343", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", 
"details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4cb37d4-2060-48b6-9e60-156a71fc7ee3", "external-id": "nsx-vlan-transportzone-819", "segmentation_id": 819, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3ae6f6d2-5e", "ovs_interfaceid": "3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 952.708908] env[67131]: DEBUG oslo_concurrency.lockutils [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] Releasing lock "refresh_cache-765e5c4e-c893-41d2-9087-43294f24f5c3" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 952.709237] env[67131]: DEBUG nova.compute.manager [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Received event network-vif-plugged-5107eb24-3ca5-41a7-989b-d6c99660cf97 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 952.709474] env[67131]: DEBUG oslo_concurrency.lockutils [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] Acquiring lock "2778d965-ad71-4239-b03a-214cd11b08ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 952.709720] env[67131]: DEBUG oslo_concurrency.lockutils [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] Lock "2778d965-ad71-4239-b03a-214cd11b08ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 952.709924] env[67131]: DEBUG oslo_concurrency.lockutils [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] Lock "2778d965-ad71-4239-b03a-214cd11b08ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 952.710149] env[67131]: DEBUG nova.compute.manager [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] No waiting events found dispatching network-vif-plugged-5107eb24-3ca5-41a7-989b-d6c99660cf97 {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 952.710537] env[67131]: WARNING nova.compute.manager [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Received unexpected event network-vif-plugged-5107eb24-3ca5-41a7-989b-d6c99660cf97 for instance with vm_state building and task_state spawning.
[ 952.710768] env[67131]: DEBUG nova.compute.manager [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Received event network-changed-5107eb24-3ca5-41a7-989b-d6c99660cf97 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 952.710972] env[67131]: DEBUG nova.compute.manager [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Refreshing instance network info cache due to event network-changed-5107eb24-3ca5-41a7-989b-d6c99660cf97. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 952.711212] env[67131]: DEBUG oslo_concurrency.lockutils [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] Acquiring lock "refresh_cache-2778d965-ad71-4239-b03a-214cd11b08ed" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 952.711399] env[67131]: DEBUG oslo_concurrency.lockutils [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] Acquired lock "refresh_cache-2778d965-ad71-4239-b03a-214cd11b08ed" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 952.711705] env[67131]: DEBUG nova.network.neutron [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Refreshing network info cache for port 5107eb24-3ca5-41a7-989b-d6c99660cf97 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 952.832913] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 952.833600] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 952.833600] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 953.036982] env[67131]: DEBUG nova.network.neutron [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Updated VIF entry in instance network info cache for port 5107eb24-3ca5-41a7-989b-d6c99660cf97. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 953.037399] env[67131]: DEBUG nova.network.neutron [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Updating instance_info_cache with network_info: [{"id": "5107eb24-3ca5-41a7-989b-d6c99660cf97", "address": "fa:16:3e:c5:4c:c3", "network": {"id": "bc831a22-de5c-415e-843d-fbbe9b4f2cfb", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1582826414-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "29932003e4164dbf99b32c2bfffe7dbf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "44ed8f45-cb8e-40e7-ac70-a7f386a7d2c2", "external-id": "nsx-vlan-transportzone-268", "segmentation_id": 268, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5107eb24-3c", "ovs_interfaceid": "5107eb24-3ca5-41a7-989b-d6c99660cf97", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 953.047808] env[67131]: DEBUG oslo_concurrency.lockutils [req-e259c97f-d790-43cc-809d-4732fabee6f7 req-ff0c490c-8d33-42e7-9bad-8d50fc39cfb0 service nova] Releasing lock "refresh_cache-2778d965-ad71-4239-b03a-214cd11b08ed" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 982.216068] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 982.216395] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 982.216475] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67131) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 983.211904] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 983.215476] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 984.215508] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 984.215760] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Starting heal instance info cache {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 984.215806] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Rebuilding the list of instances to heal {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 984.230328] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 984.230475] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 984.230605] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 984.230752] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 984.230889] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 984.231032] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Didn't find any instances for network info cache update. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 985.215421] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 985.215663] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager.update_available_resource {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 985.225604] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 985.225808] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 985.225971] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 985.226136] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67131) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 985.227318] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ca1056d-8583-4db5-a07c-b9cbe3a7b6c5 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 985.235839] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13537ebe-cadc-478f-8224-f14c9c945c0a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 985.249291] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be6152c6-ce52-4edf-8183-495140872b4a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 985.255435] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3057b857-97c1-4ea1-b562-ca4167a8ee48 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 985.284168] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180883MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67131) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 985.284327] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 985.284519] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 985.334760] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 39b67ef8-fce0-4bf3-b161-b5fbd588214b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 985.334946] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 28bf23c6-d36a-4822-9569-c825a7366ed4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 985.335887] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 7e46e878-1564-4f3b-baa5-5c99d7e04d80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 985.335887] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 765e5c4e-c893-41d2-9087-43294f24f5c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 985.335887] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 2778d965-ad71-4239-b03a-214cd11b08ed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 985.346888] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 985.357888] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance e0efe841-2ea3-4da4-973d-984dc5029baa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 985.367667] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 2fd6ec26-9e42-43fd-a09c-de43a8107aee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 985.378021] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 985.389135] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 985.389135] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 985.389135] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 985.511033] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a0862d3-2a41-46e2-912e-b2640bb54fa8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 985.518430] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aac14f83-c1a4-4fb2-8023-42884ac7d2af {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 985.549073] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f288140-f0d4-4ea4-ae28-f6715de0be39 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 985.556172] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fadda3c-3369-4c99-898d-5b990f2c68e3 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 985.568777] env[67131]: DEBUG nova.compute.provider_tree [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 985.576926] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 985.589757] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67131) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 985.589989] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.305s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 987.590331] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 987.590697] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 995.178024] env[67131]: WARNING oslo_vmware.rw_handles [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles response.begin()
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 995.178024] env[67131]: ERROR oslo_vmware.rw_handles
[ 995.178024] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/8b671220-c10e-4499-b192-85a8598e9d67/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 995.179283] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 995.179734] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Copying Virtual Disk [datastore1] vmware_temp/8b671220-c10e-4499-b192-85a8598e9d67/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] vmware_temp/8b671220-c10e-4499-b192-85a8598e9d67/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 995.180196] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b35193eb-59df-4e26-b065-670c3581741f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 995.188047] env[67131]: DEBUG oslo_vmware.api [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Waiting for the task: (returnval){
[ 995.188047] env[67131]: value = "task-3456488"
[ 995.188047] env[67131]: _type = "Task"
[ 995.188047] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 995.197051] env[67131]: DEBUG oslo_vmware.api [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Task: {'id': task-3456488, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 995.697679] env[67131]: DEBUG oslo_vmware.exceptions [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Fault InvalidArgument not matched. {{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 995.697941] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 995.698517] env[67131]: ERROR nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 995.698517] env[67131]: Faults: ['InvalidArgument']
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Traceback (most recent call last):
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] yield resources
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] self.driver.spawn(context, instance, image_meta,
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] self._fetch_image_if_missing(context, vi)
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] image_cache(vi, tmp_image_ds_loc)
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] vm_util.copy_virtual_disk(
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] session._wait_for_task(vmdk_copy_task)
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] return self.wait_for_task(task_ref)
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] return evt.wait()
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] result = hub.switch()
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] return self.greenlet.switch()
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] self.f(*self.args, **self.kw)
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] raise exceptions.translate_fault(task_info.error)
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Faults: ['InvalidArgument']
[ 995.698517] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b]
[ 995.699342] env[67131]: INFO nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Terminating instance
[ 995.700406] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 995.700614] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 995.700869] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3cf57614-a0ef-45c7-84ea-7cdb39e0c206 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 995.703237] env[67131]: DEBUG nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 995.703427] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 995.704169] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f467bac2-4548-456c-8cb1-d43e951fb6bf {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 995.711272] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 995.711495] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-901a465f-14c2-4b1f-875c-ccfee4aaa6be {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 995.713661] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 995.713857] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 995.714858] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8f1cc075-e8f4-4b83-8b0a-231334971660 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 995.719638] env[67131]: DEBUG oslo_vmware.api [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Waiting for the task: (returnval){
[ 995.719638] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52002b34-62c3-cb81-86ce-313e22b46360"
[ 995.719638] env[67131]: _type = "Task"
[ 995.719638] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 995.748154] env[67131]: DEBUG oslo_vmware.api [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52002b34-62c3-cb81-86ce-313e22b46360, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 995.776665] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 995.776883] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 995.777077] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Deleting the datastore file [datastore1] 39b67ef8-fce0-4bf3-b161-b5fbd588214b {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 995.777340] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a58a82ce-c79d-401f-9894-e0a6ab675261 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 995.783779] env[67131]: DEBUG oslo_vmware.api [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Waiting for the task: (returnval){
[ 995.783779] env[67131]: value = "task-3456490"
[ 995.783779] env[67131]: _type = "Task"
[ 995.783779] env[67131]: } to complete.
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 995.790979] env[67131]: DEBUG oslo_vmware.api [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Task: {'id': task-3456490, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 996.229851] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 996.230224] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Creating directory with path [datastore1] vmware_temp/6904251c-024a-425d-8a8f-34d1b2214e6c/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 996.230276] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4ffa16f3-56fc-45e0-b4f9-4384851eb3ce {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.242062] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Created directory with path [datastore1] vmware_temp/6904251c-024a-425d-8a8f-34d1b2214e6c/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 996.242253] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None 
req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Fetch image to [datastore1] vmware_temp/6904251c-024a-425d-8a8f-34d1b2214e6c/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 996.242424] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/6904251c-024a-425d-8a8f-34d1b2214e6c/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 996.243151] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84b00ae9-2aa5-410c-84ef-3e3c0edbfb94 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.249579] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26b61637-ebc3-4a17-a983-b7079e0c079a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.258240] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9ef6d97-6ecb-4bc6-9974-fc380402f217 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.290216] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c5dca35-1962-4fef-bde6-1ee7d1dab8e6 {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.297773] env[67131]: DEBUG oslo_vmware.api [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Task: {'id': task-3456490, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.088104} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 996.299150] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 996.299341] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 996.299511] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 996.299683] env[67131]: INFO nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 996.301383] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aeb1b4bf-02d7-4e75-a1d4-00e8503959a6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.303268] env[67131]: DEBUG nova.compute.claims [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 996.303438] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 996.303642] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 996.327093] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 996.372730] env[67131]: DEBUG oslo_vmware.rw_handles [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 
tempest-InstanceActionsV221TestJSON-1140337818-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6904251c-024a-425d-8a8f-34d1b2214e6c/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 996.431543] env[67131]: DEBUG oslo_vmware.rw_handles [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 996.431722] env[67131]: DEBUG oslo_vmware.rw_handles [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6904251c-024a-425d-8a8f-34d1b2214e6c/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 996.501654] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74601a17-18ca-4cda-9784-d5e5a5b83fa2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.509200] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27b12574-0aef-4ad7-97cc-86f26b32b36a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.537990] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dfdc6b5-bfa4-4f2e-bdb3-52e78fb45a9f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.544453] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32d76be4-9679-4ae7-b598-988d919519fd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.556911] env[67131]: DEBUG nova.compute.provider_tree [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 996.564964] env[67131]: DEBUG nova.scheduler.client.report [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 996.577350] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.274s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.577898] env[67131]: ERROR nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 996.577898] env[67131]: Faults: ['InvalidArgument'] [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Traceback (most recent call last): [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] self.driver.spawn(context, instance, image_meta, [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 
39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] self._fetch_image_if_missing(context, vi) [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] image_cache(vi, tmp_image_ds_loc) [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] vm_util.copy_virtual_disk( [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] session._wait_for_task(vmdk_copy_task) [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] return self.wait_for_task(task_ref) [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] return evt.wait() [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] result = hub.switch() [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] return self.greenlet.switch() [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] self.f(*self.args, **self.kw) [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] raise exceptions.translate_fault(task_info.error) [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Faults: ['InvalidArgument'] [ 996.577898] env[67131]: ERROR nova.compute.manager [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] [ 996.578741] env[67131]: DEBUG nova.compute.utils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] VimFaultException {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 996.579859] 
env[67131]: DEBUG nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Build of instance 39b67ef8-fce0-4bf3-b161-b5fbd588214b was re-scheduled: A specified parameter was not correct: fileType [ 996.579859] env[67131]: Faults: ['InvalidArgument'] {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 996.580239] env[67131]: DEBUG nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 996.580407] env[67131]: DEBUG nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 996.580559] env[67131]: DEBUG nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 996.580721] env[67131]: DEBUG nova.network.neutron [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 996.797955] env[67131]: DEBUG nova.network.neutron [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 996.812893] env[67131]: INFO nova.compute.manager [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Took 0.23 seconds to deallocate network for instance. 
[ 996.900697] env[67131]: INFO nova.scheduler.client.report [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Deleted allocations for instance 39b67ef8-fce0-4bf3-b161-b5fbd588214b [ 996.920297] env[67131]: DEBUG oslo_concurrency.lockutils [None req-9ea82a5b-c414-4ade-aa9a-a93292734abf tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 386.036s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.921489] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 188.552s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 996.921721] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Acquiring lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 996.921920] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 996.922097] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.924143] env[67131]: INFO nova.compute.manager [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Terminating instance [ 996.926985] env[67131]: DEBUG nova.compute.manager [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 996.927136] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 996.927458] env[67131]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5bc7cb9a-1ff9-4ac8-9066-aa9937fff417 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.936735] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d78080e-8980-4362-9ad4-14257227aaa0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.948419] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 996.969844] env[67131]: WARNING nova.virt.vmwareapi.vmops [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 39b67ef8-fce0-4bf3-b161-b5fbd588214b could not be found. 
[ 996.969844] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 996.969844] env[67131]: INFO nova.compute.manager [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 996.969844] env[67131]: DEBUG oslo.service.loopingcall [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 996.969844] env[67131]: DEBUG nova.compute.manager [-] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 996.969844] env[67131]: DEBUG nova.network.neutron [-] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 996.999134] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 996.999134] env[67131]: DEBUG oslo_concurrency.lockutils [None 
req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 997.000593] env[67131]: INFO nova.compute.claims [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 997.012114] env[67131]: DEBUG nova.network.neutron [-] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 997.031697] env[67131]: INFO nova.compute.manager [-] [instance: 39b67ef8-fce0-4bf3-b161-b5fbd588214b] Took 0.06 seconds to deallocate network for instance. 
[ 997.154390] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7aa5a15c-b122-4017-bf02-5c85a765aa17 tempest-MigrationsAdminTest-1703286375 tempest-MigrationsAdminTest-1703286375-project-member] Lock "39b67ef8-fce0-4bf3-b161-b5fbd588214b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.233s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 997.180884] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edd8bdb5-5484-4296-9dfe-2a978f8565f1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.188885] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7765d9ee-b343-4069-b449-dbf5ca9be26f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.220719] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ff54997-68ab-45a1-a7c1-35d6150bd21e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.227735] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dbe9f1f-9e87-4141-95ff-6529aa244937 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.241191] env[67131]: DEBUG nova.compute.provider_tree [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 997.248938] env[67131]: DEBUG nova.scheduler.client.report [None 
req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 997.263040] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 997.263040] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Start building networks asynchronously for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 997.292186] env[67131]: DEBUG nova.compute.utils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 997.293651] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 997.293819] env[67131]: DEBUG nova.network.neutron [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 997.301690] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Start building block device mappings for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 997.347517] env[67131]: DEBUG nova.policy [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e005632940a745888e7874764998b27a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f1d6d6b17bbe4417a594852fb64383de', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 997.359055] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 997.379965] env[67131]: DEBUG nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 997.380199] env[67131]: DEBUG nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 997.380366] env[67131]: DEBUG nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 997.380543] env[67131]: DEBUG nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 
tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 997.380787] env[67131]: DEBUG nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 997.380966] env[67131]: DEBUG nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 997.381191] env[67131]: DEBUG nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 997.381349] env[67131]: DEBUG nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 997.381516] env[67131]: DEBUG nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 997.381674] env[67131]: DEBUG 
nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 997.381839] env[67131]: DEBUG nova.virt.hardware [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 997.382665] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72859b79-f053-4384-ae5a-36fc8f4e74b8 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.390334] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1073feb-91dc-49f9-9797-191e46ce4118 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.607591] env[67131]: DEBUG nova.network.neutron [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Successfully created port: 603cde36-ddd9-4ece-9b38-2431a2101c5d {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 998.543290] env[67131]: DEBUG nova.network.neutron [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Successfully updated port: 603cde36-ddd9-4ece-9b38-2431a2101c5d {{(pid=67131) _update_port 
/opt/stack/nova/nova/network/neutron.py:586}} [ 998.554653] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Acquiring lock "refresh_cache-52c4c672-9e1a-42b9-9cf8-ca6de5a6b585" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 998.554879] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Acquired lock "refresh_cache-52c4c672-9e1a-42b9-9cf8-ca6de5a6b585" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 998.555099] env[67131]: DEBUG nova.network.neutron [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 998.634380] env[67131]: DEBUG nova.network.neutron [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 998.872615] env[67131]: DEBUG nova.compute.manager [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Received event network-vif-plugged-603cde36-ddd9-4ece-9b38-2431a2101c5d {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 998.873091] env[67131]: DEBUG oslo_concurrency.lockutils [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] Acquiring lock "52c4c672-9e1a-42b9-9cf8-ca6de5a6b585-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 998.873167] env[67131]: DEBUG oslo_concurrency.lockutils [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] Lock "52c4c672-9e1a-42b9-9cf8-ca6de5a6b585-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 998.873334] env[67131]: DEBUG oslo_concurrency.lockutils [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] Lock "52c4c672-9e1a-42b9-9cf8-ca6de5a6b585-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 998.873494] env[67131]: DEBUG nova.compute.manager [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] No waiting events found dispatching network-vif-plugged-603cde36-ddd9-4ece-9b38-2431a2101c5d {{(pid=67131) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 998.873654] env[67131]: WARNING nova.compute.manager [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Received unexpected event network-vif-plugged-603cde36-ddd9-4ece-9b38-2431a2101c5d for instance with vm_state building and task_state spawning. [ 998.873809] env[67131]: DEBUG nova.compute.manager [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Received event network-changed-603cde36-ddd9-4ece-9b38-2431a2101c5d {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 998.873958] env[67131]: DEBUG nova.compute.manager [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Refreshing instance network info cache due to event network-changed-603cde36-ddd9-4ece-9b38-2431a2101c5d. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 998.874137] env[67131]: DEBUG oslo_concurrency.lockutils [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] Acquiring lock "refresh_cache-52c4c672-9e1a-42b9-9cf8-ca6de5a6b585" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 998.877861] env[67131]: DEBUG nova.network.neutron [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Updating instance_info_cache with network_info: [{"id": "603cde36-ddd9-4ece-9b38-2431a2101c5d", "address": "fa:16:3e:fb:ab:3f", "network": {"id": "6fc4a85d-42b3-41fb-930d-a7049e7f95ef", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1145530956-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f1d6d6b17bbe4417a594852fb64383de", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "089a7624-43ba-4fce-bfc0-63e4bb7f9aeb", "external-id": "nsx-vlan-transportzone-218", "segmentation_id": 218, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap603cde36-dd", "ovs_interfaceid": "603cde36-ddd9-4ece-9b38-2431a2101c5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 998.887934] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Releasing lock "refresh_cache-52c4c672-9e1a-42b9-9cf8-ca6de5a6b585" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 998.888104] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Instance network_info: |[{"id": "603cde36-ddd9-4ece-9b38-2431a2101c5d", "address": "fa:16:3e:fb:ab:3f", "network": {"id": "6fc4a85d-42b3-41fb-930d-a7049e7f95ef", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1145530956-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f1d6d6b17bbe4417a594852fb64383de", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "089a7624-43ba-4fce-bfc0-63e4bb7f9aeb", "external-id": "nsx-vlan-transportzone-218", "segmentation_id": 218, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap603cde36-dd", "ovs_interfaceid": "603cde36-ddd9-4ece-9b38-2431a2101c5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 
998.888359] env[67131]: DEBUG oslo_concurrency.lockutils [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] Acquired lock "refresh_cache-52c4c672-9e1a-42b9-9cf8-ca6de5a6b585" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 998.888536] env[67131]: DEBUG nova.network.neutron [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Refreshing network info cache for port 603cde36-ddd9-4ece-9b38-2431a2101c5d {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 998.889515] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fb:ab:3f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '089a7624-43ba-4fce-bfc0-63e4bb7f9aeb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '603cde36-ddd9-4ece-9b38-2431a2101c5d', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 998.896882] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Creating folder: Project (f1d6d6b17bbe4417a594852fb64383de). Parent ref: group-v690228. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 998.897838] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b5f0f8b3-0e73-430c-a747-36f424ea93bc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.912226] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Created folder: Project (f1d6d6b17bbe4417a594852fb64383de) in parent group-v690228. [ 998.912405] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Creating folder: Instances. Parent ref: group-v690292. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 998.912607] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-edfb972a-8ef1-4fb8-bd1f-afb2ef8d43d6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.920812] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Created folder: Instances in parent group-v690292. [ 998.921060] env[67131]: DEBUG oslo.service.loopingcall [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 998.921188] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 998.921394] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-245f27ff-a941-4d1b-b7dd-bc0c5a094a80 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.943191] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 998.943191] env[67131]: value = "task-3456493" [ 998.943191] env[67131]: _type = "Task" [ 998.943191] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 998.950654] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456493, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 999.266191] env[67131]: DEBUG nova.network.neutron [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Updated VIF entry in instance network info cache for port 603cde36-ddd9-4ece-9b38-2431a2101c5d. 
{{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 999.266574] env[67131]: DEBUG nova.network.neutron [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Updating instance_info_cache with network_info: [{"id": "603cde36-ddd9-4ece-9b38-2431a2101c5d", "address": "fa:16:3e:fb:ab:3f", "network": {"id": "6fc4a85d-42b3-41fb-930d-a7049e7f95ef", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1145530956-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f1d6d6b17bbe4417a594852fb64383de", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "089a7624-43ba-4fce-bfc0-63e4bb7f9aeb", "external-id": "nsx-vlan-transportzone-218", "segmentation_id": 218, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap603cde36-dd", "ovs_interfaceid": "603cde36-ddd9-4ece-9b38-2431a2101c5d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 999.275554] env[67131]: DEBUG oslo_concurrency.lockutils [req-a2e512c1-85f6-4a32-b49a-e03f392eeb30 req-984148ae-b3cb-4691-a1a7-e5b57f038562 service nova] Releasing lock "refresh_cache-52c4c672-9e1a-42b9-9cf8-ca6de5a6b585" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 999.452897] env[67131]: DEBUG oslo_vmware.api [-] Task: 
{'id': task-3456493, 'name': CreateVM_Task, 'duration_secs': 0.301985} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 999.453074] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 999.453649] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 999.453824] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 999.454167] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 999.454391] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-91a20863-fb61-4621-8c76-88130a4d5226 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.458659] env[67131]: DEBUG oslo_vmware.api [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 
tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Waiting for the task: (returnval){ [ 999.458659] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]526a9dd8-475d-cc26-2824-2be2af2977fc" [ 999.458659] env[67131]: _type = "Task" [ 999.458659] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 999.465749] env[67131]: DEBUG oslo_vmware.api [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]526a9dd8-475d-cc26-2824-2be2af2977fc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 999.969086] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 999.969350] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 999.969551] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Acquiring lock "[datastore1] 
devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1005.160902] env[67131]: DEBUG nova.compute.manager [req-25801c99-f7cf-44cc-84c0-756bd81c8280 req-fd068442-5209-4dd9-9357-e8ad2fe4ad16 service nova] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Received event network-vif-deleted-3ae6f6d2-5ef5-4935-a7d4-5378bb8b5939 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1012.633127] env[67131]: DEBUG nova.compute.manager [req-063f725c-debd-45f3-88e6-3812cbb9d381 req-1bdae161-34c9-4476-ab89-f50906098e16 service nova] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Received event network-vif-deleted-5107eb24-3ca5-41a7-989b-d6c99660cf97 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1014.927929] env[67131]: DEBUG nova.compute.manager [req-f8123f8f-4f65-46f3-a301-0e31f4fa56a1 req-8b91b86e-c952-4890-bbf6-e58fd227d6b3 service nova] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Received event network-vif-deleted-603cde36-ddd9-4ece-9b38-2431a2101c5d {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1042.056538] env[67131]: WARNING oslo_vmware.rw_handles [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles response.begin() [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1042.056538] env[67131]: ERROR oslo_vmware.rw_handles [ 1042.057472] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/6904251c-024a-425d-8a8f-34d1b2214e6c/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1042.058892] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1042.059151] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/6904251c-024a-425d-8a8f-34d1b2214e6c/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] vmware_temp/6904251c-024a-425d-8a8f-34d1b2214e6c/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1042.059465] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ebf1a999-942c-4747-94c6-778c31166c02 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.067695] env[67131]: DEBUG oslo_vmware.api [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Waiting for the task: (returnval){ [ 1042.067695] env[67131]: value = "task-3456494" [ 1042.067695] env[67131]: _type = "Task" [ 1042.067695] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1042.075445] env[67131]: DEBUG oslo_vmware.api [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Task: {'id': task-3456494, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1042.211229] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1042.224029] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1042.224178] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67131) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1042.578079] env[67131]: DEBUG oslo_vmware.exceptions [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Fault InvalidArgument not matched. 
{{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1042.578210] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1042.578710] env[67131]: ERROR nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1042.578710] env[67131]: Faults: ['InvalidArgument'] [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Traceback (most recent call last): [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] yield resources [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] self.driver.spawn(context, instance, image_meta, [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1042.578710] env[67131]: ERROR nova.compute.manager 
[instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] self._fetch_image_if_missing(context, vi) [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] image_cache(vi, tmp_image_ds_loc) [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] vm_util.copy_virtual_disk( [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] session._wait_for_task(vmdk_copy_task) [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] return self.wait_for_task(task_ref) [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1042.578710] env[67131]: ERROR nova.compute.manager 
[instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] return evt.wait() [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] result = hub.switch() [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] return self.greenlet.switch() [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] self.f(*self.args, **self.kw) [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] raise exceptions.translate_fault(task_info.error) [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Faults: ['InvalidArgument'] [ 1042.578710] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] [ 1042.579914] env[67131]: INFO nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 
tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Terminating instance [ 1042.580578] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1042.580780] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1042.581009] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-11bbfd71-8dfb-4ea6-9daa-6a16d75bce35 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.584052] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1042.584226] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1042.584955] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67cf285d-6fcf-4e57-b5f4-e61464b83299 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.591741] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1042.591946] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-aea28be6-3f9c-4de1-af6d-20ea56c5f821 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.595031] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1042.595031] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1042.595433] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2ff1540e-cce0-402a-ad71-fbbc4778329b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.600106] env[67131]: DEBUG oslo_vmware.api [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Waiting for the task: (returnval){ [ 1042.600106] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5242e8ec-e981-27f7-b348-9e57d47adeb9" [ 1042.600106] env[67131]: _type = "Task" [ 1042.600106] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1042.606940] env[67131]: DEBUG oslo_vmware.api [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5242e8ec-e981-27f7-b348-9e57d47adeb9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1042.658384] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1042.658658] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1042.658765] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Deleting the datastore file [datastore1] 28bf23c6-d36a-4822-9569-c825a7366ed4 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1042.659035] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-68db9f86-1ddc-4574-83e8-71dc3722e8eb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.664710] env[67131]: DEBUG oslo_vmware.api [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Waiting for the task: (returnval){ [ 1042.664710] env[67131]: value = "task-3456496" [ 1042.664710] env[67131]: _type = "Task" [ 1042.664710] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1042.672316] env[67131]: DEBUG oslo_vmware.api [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Task: {'id': task-3456496, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1043.110680] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1043.111119] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Creating directory with path [datastore1] vmware_temp/b9233dfe-5993-4411-b251-794eeeeb5e44/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1043.111219] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-85473d39-ecf3-458c-a843-65913ee3b020 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.123951] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Created directory with path [datastore1] vmware_temp/b9233dfe-5993-4411-b251-794eeeeb5e44/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1043.124152] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None 
req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Fetch image to [datastore1] vmware_temp/b9233dfe-5993-4411-b251-794eeeeb5e44/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1043.124347] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/b9233dfe-5993-4411-b251-794eeeeb5e44/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1043.125018] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd0bc012-7365-4e76-91ab-b149516a0fc1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.131292] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42f5ee28-d181-481a-a9a4-db0aafee6df7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.140324] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59147030-b6d4-4f42-a963-30493b338d63 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.172144] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbe41ec0-f1cf-4424-9d3b-fef9c73960b0 {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.179841] env[67131]: DEBUG oslo_vmware.api [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Task: {'id': task-3456496, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082051} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1043.181257] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1043.181435] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1043.181599] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1043.181777] env[67131]: INFO nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1043.183504] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5f90bcf7-ab32-4963-b560-1fc4a4b5dc3f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.185318] env[67131]: DEBUG nova.compute.claims [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1043.185489] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1043.185693] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1043.207567] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1043.215849] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f 
None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1043.254566] env[67131]: DEBUG oslo_vmware.rw_handles [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b9233dfe-5993-4411-b251-794eeeeb5e44/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1043.312239] env[67131]: DEBUG oslo_vmware.rw_handles [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1043.312425] env[67131]: DEBUG oslo_vmware.rw_handles [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b9233dfe-5993-4411-b251-794eeeeb5e44/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1043.341884] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f7b101f-258a-45bf-bb20-db0932c06dd2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.349545] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-845a0f38-7df6-49c0-917a-76526eeacffb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.378177] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f84522e-2896-4817-b4c9-f2e194811af4 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.384801] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e213a39d-e9e1-4717-8637-b7895ce705dc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.397246] env[67131]: DEBUG nova.compute.provider_tree [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1043.405325] env[67131]: DEBUG nova.scheduler.client.report [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1043.419328] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.234s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.419834] env[67131]: ERROR nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1043.419834] env[67131]: Faults: ['InvalidArgument'] [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Traceback (most recent call last): [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] self.driver.spawn(context, instance, image_meta, [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] self._vmops.spawn(context, instance, image_meta, injected_files, 
[ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] self._fetch_image_if_missing(context, vi) [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] image_cache(vi, tmp_image_ds_loc) [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] vm_util.copy_virtual_disk( [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] session._wait_for_task(vmdk_copy_task) [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] return self.wait_for_task(task_ref) [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] return evt.wait() [ 1043.419834] env[67131]: ERROR nova.compute.manager 
[instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] result = hub.switch() [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] return self.greenlet.switch() [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] self.f(*self.args, **self.kw) [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] raise exceptions.translate_fault(task_info.error) [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Faults: ['InvalidArgument'] [ 1043.419834] env[67131]: ERROR nova.compute.manager [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] [ 1043.421181] env[67131]: DEBUG nova.compute.utils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] VimFaultException {{(pid=67131) 
notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1043.421830] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Build of instance 28bf23c6-d36a-4822-9569-c825a7366ed4 was re-scheduled: A specified parameter was not correct: fileType [ 1043.421830] env[67131]: Faults: ['InvalidArgument'] {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1043.422215] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1043.422388] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1043.422555] env[67131]: DEBUG nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1043.422713] env[67131]: DEBUG nova.network.neutron [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1043.686064] env[67131]: DEBUG nova.network.neutron [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1043.695889] env[67131]: INFO nova.compute.manager [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Took 0.27 seconds to deallocate network for instance. 
[ 1043.774388] env[67131]: INFO nova.scheduler.client.report [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Deleted allocations for instance 28bf23c6-d36a-4822-9569-c825a7366ed4 [ 1043.790365] env[67131]: DEBUG oslo_concurrency.lockutils [None req-d10dfeca-d017-49de-bb5f-47451ca688bc tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Lock "28bf23c6-d36a-4822-9569-c825a7366ed4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 418.935s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.791415] env[67131]: DEBUG oslo_concurrency.lockutils [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Lock "28bf23c6-d36a-4822-9569-c825a7366ed4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 219.558s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1043.791631] env[67131]: DEBUG oslo_concurrency.lockutils [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Acquiring lock "28bf23c6-d36a-4822-9569-c825a7366ed4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1043.791827] env[67131]: DEBUG oslo_concurrency.lockutils [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Lock "28bf23c6-d36a-4822-9569-c825a7366ed4-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1043.791985] env[67131]: DEBUG oslo_concurrency.lockutils [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Lock "28bf23c6-d36a-4822-9569-c825a7366ed4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.793851] env[67131]: INFO nova.compute.manager [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Terminating instance [ 1043.795432] env[67131]: DEBUG nova.compute.manager [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1043.795646] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1043.796089] env[67131]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d885fa7f-ed57-4dd2-9146-ad1188fac2ec {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.805286] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9faf0c6-3a09-48ad-afdc-bf164270068d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.816701] env[67131]: DEBUG nova.compute.manager [None req-8182dd17-09c3-4f1c-8c67-7b44c6b23b5c tempest-ServerAddressesNegativeTestJSON-2077313614 tempest-ServerAddressesNegativeTestJSON-2077313614-project-member] [instance: e0efe841-2ea3-4da4-973d-984dc5029baa] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1043.837736] env[67131]: WARNING nova.virt.vmwareapi.vmops [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 28bf23c6-d36a-4822-9569-c825a7366ed4 could not be found. 
[ 1043.837736] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1043.837917] env[67131]: INFO nova.compute.manager [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1043.838242] env[67131]: DEBUG oslo.service.loopingcall [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1043.838423] env[67131]: DEBUG nova.compute.manager [-] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1043.838525] env[67131]: DEBUG nova.network.neutron [-] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1043.841246] env[67131]: DEBUG nova.compute.manager [None req-8182dd17-09c3-4f1c-8c67-7b44c6b23b5c tempest-ServerAddressesNegativeTestJSON-2077313614 tempest-ServerAddressesNegativeTestJSON-2077313614-project-member] [instance: e0efe841-2ea3-4da4-973d-984dc5029baa] Instance disappeared before build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1043.860455] env[67131]: DEBUG nova.network.neutron [-] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1043.862102] env[67131]: DEBUG oslo_concurrency.lockutils [None req-8182dd17-09c3-4f1c-8c67-7b44c6b23b5c tempest-ServerAddressesNegativeTestJSON-2077313614 tempest-ServerAddressesNegativeTestJSON-2077313614-project-member] Lock "e0efe841-2ea3-4da4-973d-984dc5029baa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 219.937s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.867895] env[67131]: INFO nova.compute.manager [-] [instance: 28bf23c6-d36a-4822-9569-c825a7366ed4] Took 0.03 seconds to deallocate network for instance. [ 1043.872235] env[67131]: DEBUG nova.compute.manager [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Starting instance... 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1043.917103] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1043.917355] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1043.918827] env[67131]: INFO nova.compute.claims [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1043.968871] env[67131]: DEBUG oslo_concurrency.lockutils [None req-555ac4b3-e269-4072-acb6-066b94d25ab7 tempest-InstanceActionsV221TestJSON-1140337818 tempest-InstanceActionsV221TestJSON-1140337818-project-member] Lock "28bf23c6-d36a-4822-9569-c825a7366ed4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.177s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1044.028556] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ef7433d-d809-4cbc-adba-8da31a7dda2b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1044.036039] env[67131]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-103a9ea0-0aa5-412a-ad1b-e8195f5e31fa {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1044.066059] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-836ed8b1-a8c1-492b-a929-3161eb4d798f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1044.072652] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ae0fef2-f972-49bd-8f7e-202b59ea137d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1044.085433] env[67131]: DEBUG nova.compute.provider_tree [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1044.094009] env[67131]: DEBUG nova.scheduler.client.report [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1044.106404] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e 
tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1044.106872] env[67131]: DEBUG nova.compute.manager [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1044.136290] env[67131]: DEBUG nova.compute.utils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1044.137621] env[67131]: DEBUG nova.compute.manager [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Allocating IP information in the background. 
{{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1044.137621] env[67131]: DEBUG nova.network.neutron [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1044.149046] env[67131]: DEBUG nova.compute.manager [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1044.191434] env[67131]: DEBUG nova.policy [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e46dd303203c4e23b8881daba253015b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '45e91ac7d6fc4c15ba72910fb44b6d8a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 1044.208874] env[67131]: DEBUG nova.compute.manager [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1044.211216] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1044.215013] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1044.215163] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Starting heal instance info cache {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1044.215291] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Rebuilding the list of instances to heal {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1044.226215] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1044.226366] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1044.226494] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Didn't find any instances for network info cache update. 
{{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1044.228882] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1044.229118] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1044.229278] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1044.229458] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Flavor pref 0:0:0 
{{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1044.229600] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1044.229743] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1044.229942] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1044.230112] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1044.230279] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1044.230439] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 
tempest-DeleteServersTestJSON-1481289198-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1044.230605] env[67131]: DEBUG nova.virt.hardware [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1044.231641] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-796c78bb-00b7-4875-9e6a-b6d5cf5e40ba {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1044.239977] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5266203f-3dfe-4ebf-818a-2766ab5b0820 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1044.470342] env[67131]: DEBUG nova.network.neutron [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Successfully created port: 52989d3c-b2b5-4dff-a7cf-b28c6a9f0080 {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1045.161076] env[67131]: DEBUG nova.network.neutron [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Successfully updated port: 52989d3c-b2b5-4dff-a7cf-b28c6a9f0080 {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1045.169432] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e 
tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Acquiring lock "refresh_cache-2fd6ec26-9e42-43fd-a09c-de43a8107aee" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1045.169621] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Acquired lock "refresh_cache-2fd6ec26-9e42-43fd-a09c-de43a8107aee" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1045.169809] env[67131]: DEBUG nova.network.neutron [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1045.204740] env[67131]: DEBUG nova.network.neutron [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Instance cache missing network info. 
{{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1045.215093] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1045.215514] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1045.352440] env[67131]: DEBUG nova.network.neutron [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Updating instance_info_cache with network_info: [{"id": "52989d3c-b2b5-4dff-a7cf-b28c6a9f0080", "address": "fa:16:3e:58:f8:d9", "network": {"id": "9a6b4228-063d-43db-a661-f437e8ad602e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2014239905-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "45e91ac7d6fc4c15ba72910fb44b6d8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap52989d3c-b2", "ovs_interfaceid": "52989d3c-b2b5-4dff-a7cf-b28c6a9f0080", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1045.362807] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Releasing lock "refresh_cache-2fd6ec26-9e42-43fd-a09c-de43a8107aee" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1045.363111] env[67131]: DEBUG nova.compute.manager [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Instance network_info: |[{"id": "52989d3c-b2b5-4dff-a7cf-b28c6a9f0080", "address": "fa:16:3e:58:f8:d9", "network": {"id": "9a6b4228-063d-43db-a661-f437e8ad602e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2014239905-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "45e91ac7d6fc4c15ba72910fb44b6d8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap52989d3c-b2", "ovs_interfaceid": "52989d3c-b2b5-4dff-a7cf-b28c6a9f0080", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1045.363505] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:58:f8:d9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ea4fe416-47a6-4542-b59d-8c71ab4d6503', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '52989d3c-b2b5-4dff-a7cf-b28c6a9f0080', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1045.370966] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Creating folder: Project (45e91ac7d6fc4c15ba72910fb44b6d8a). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1045.371448] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cc94a9ad-ee06-4cce-8fd6-01f9d88933ab {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.383049] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Created folder: Project (45e91ac7d6fc4c15ba72910fb44b6d8a) in parent group-v690228. [ 1045.383256] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Creating folder: Instances. Parent ref: group-v690295. 
{{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1045.383480] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5f867553-c4ee-49bb-9481-128b1faef55f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.392330] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Created folder: Instances in parent group-v690295. [ 1045.392546] env[67131]: DEBUG oslo.service.loopingcall [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1045.392718] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1045.392893] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8f0a0f84-cdfd-4df1-80db-93d6c633b207 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.411210] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1045.411210] env[67131]: value = "task-3456499" [ 1045.411210] env[67131]: _type = "Task" [ 1045.411210] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1045.418307] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456499, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1045.722107] env[67131]: DEBUG nova.compute.manager [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Received event network-vif-plugged-52989d3c-b2b5-4dff-a7cf-b28c6a9f0080 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1045.722399] env[67131]: DEBUG oslo_concurrency.lockutils [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] Acquiring lock "2fd6ec26-9e42-43fd-a09c-de43a8107aee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1045.722647] env[67131]: DEBUG oslo_concurrency.lockutils [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] Lock "2fd6ec26-9e42-43fd-a09c-de43a8107aee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1045.722806] env[67131]: DEBUG oslo_concurrency.lockutils [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] Lock "2fd6ec26-9e42-43fd-a09c-de43a8107aee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1045.722973] env[67131]: DEBUG nova.compute.manager [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] No waiting events found dispatching network-vif-plugged-52989d3c-b2b5-4dff-a7cf-b28c6a9f0080 {{(pid=67131) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1045.723413] env[67131]: WARNING nova.compute.manager [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Received unexpected event network-vif-plugged-52989d3c-b2b5-4dff-a7cf-b28c6a9f0080 for instance with vm_state building and task_state spawning. [ 1045.723589] env[67131]: DEBUG nova.compute.manager [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Received event network-changed-52989d3c-b2b5-4dff-a7cf-b28c6a9f0080 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1045.723742] env[67131]: DEBUG nova.compute.manager [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Refreshing instance network info cache due to event network-changed-52989d3c-b2b5-4dff-a7cf-b28c6a9f0080. 
{{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1045.723922] env[67131]: DEBUG oslo_concurrency.lockutils [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] Acquiring lock "refresh_cache-2fd6ec26-9e42-43fd-a09c-de43a8107aee" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1045.724068] env[67131]: DEBUG oslo_concurrency.lockutils [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] Acquired lock "refresh_cache-2fd6ec26-9e42-43fd-a09c-de43a8107aee" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1045.724229] env[67131]: DEBUG nova.network.neutron [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Refreshing network info cache for port 52989d3c-b2b5-4dff-a7cf-b28c6a9f0080 {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1045.922945] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456499, 'name': CreateVM_Task, 'duration_secs': 0.296628} completed successfully. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1045.923129] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1045.923783] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1045.923945] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1045.924267] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1045.924540] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b015e09a-0956-4ee4-883d-9d1b0719199a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.928851] env[67131]: DEBUG oslo_vmware.api [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Waiting for the task: (returnval){ [ 
1045.928851] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52a93951-ddbc-cb25-4b75-729c96c5196d" [ 1045.928851] env[67131]: _type = "Task" [ 1045.928851] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1045.936360] env[67131]: DEBUG oslo_vmware.api [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52a93951-ddbc-cb25-4b75-729c96c5196d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1045.960867] env[67131]: DEBUG nova.network.neutron [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Updated VIF entry in instance network info cache for port 52989d3c-b2b5-4dff-a7cf-b28c6a9f0080. 
{{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1045.961287] env[67131]: DEBUG nova.network.neutron [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Updating instance_info_cache with network_info: [{"id": "52989d3c-b2b5-4dff-a7cf-b28c6a9f0080", "address": "fa:16:3e:58:f8:d9", "network": {"id": "9a6b4228-063d-43db-a661-f437e8ad602e", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2014239905-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "45e91ac7d6fc4c15ba72910fb44b6d8a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap52989d3c-b2", "ovs_interfaceid": "52989d3c-b2b5-4dff-a7cf-b28c6a9f0080", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1045.970232] env[67131]: DEBUG oslo_concurrency.lockutils [req-bf417717-4c2c-4382-8684-1b8055ba927a req-b195e517-e9f0-4334-a4ea-d1ed533068b0 service nova] Releasing lock "refresh_cache-2fd6ec26-9e42-43fd-a09c-de43a8107aee" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1046.439571] env[67131]: DEBUG oslo_concurrency.lockutils [None 
req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1046.439873] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1046.440020] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1047.215722] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1047.215940] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager.update_available_resource {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1047.225794] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67131) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1047.225992] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1047.226175] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1047.226331] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67131) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1047.227345] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d59ceed7-77d0-43d9-8f90-c442df7ff612 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.235679] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59937b3b-dc8d-4027-a02b-29e09d8445b2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.249400] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d314f625-792b-4e4e-b661-22401f2d3ebf {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.255350] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-db46ef3d-c360-4095-89c4-52defbf232f6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.284850] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180884MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67131) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1047.285009] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1047.285208] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1047.323415] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 7e46e878-1564-4f3b-baa5-5c99d7e04d80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1047.323571] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 2fd6ec26-9e42-43fd-a09c-de43a8107aee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1047.333048] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1047.342933] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1047.343136] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1047.343308] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1047.402054] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3c645d7-6372-4794-b112-645a5ce9e5f9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.410438] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b5bd3de-595f-475f-9e4e-29acae66da3a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.439334] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-530d9454-2a08-45f7-9b93-fad2d6df5a55 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.446197] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86f631f3-ab3f-4cf4-968a-c7beed734093 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.460141] env[67131]: DEBUG nova.compute.provider_tree [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed in ProviderTree for provider: 
d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1047.467959] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1047.480043] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67131) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1047.480306] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.479684] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1090.710522] env[67131]: WARNING oslo_vmware.rw_handles [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Error occurred while reading the HTTP response.: 
http.client.RemoteDisconnected: Remote end closed connection without response [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles response.begin() [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1090.710522] env[67131]: ERROR oslo_vmware.rw_handles [ 1090.711447] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/b9233dfe-5993-4411-b251-794eeeeb5e44/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1090.712766] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] 
[instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1090.713039] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Copying Virtual Disk [datastore1] vmware_temp/b9233dfe-5993-4411-b251-794eeeeb5e44/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] vmware_temp/b9233dfe-5993-4411-b251-794eeeeb5e44/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1090.713310] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-56120bc5-9c47-462f-a975-0f9e6c36607e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1090.722285] env[67131]: DEBUG oslo_vmware.api [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Waiting for the task: (returnval){ [ 1090.722285] env[67131]: value = "task-3456500" [ 1090.722285] env[67131]: _type = "Task" [ 1090.722285] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1090.729867] env[67131]: DEBUG oslo_vmware.api [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Task: {'id': task-3456500, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1091.233108] env[67131]: DEBUG oslo_vmware.exceptions [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Fault InvalidArgument not matched. {{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1091.233368] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1091.233944] env[67131]: ERROR nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1091.233944] env[67131]: Faults: ['InvalidArgument'] [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Traceback (most recent call last): [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] yield resources [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 
7e46e878-1564-4f3b-baa5-5c99d7e04d80] self.driver.spawn(context, instance, image_meta, [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] self._fetch_image_if_missing(context, vi) [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] image_cache(vi, tmp_image_ds_loc) [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] vm_util.copy_virtual_disk( [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] session._wait_for_task(vmdk_copy_task) [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 
7e46e878-1564-4f3b-baa5-5c99d7e04d80] return self.wait_for_task(task_ref) [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] return evt.wait() [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] result = hub.switch() [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] return self.greenlet.switch() [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] self.f(*self.args, **self.kw) [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] raise exceptions.translate_fault(task_info.error) [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 
7e46e878-1564-4f3b-baa5-5c99d7e04d80] Faults: ['InvalidArgument'] [ 1091.233944] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] [ 1091.235258] env[67131]: INFO nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Terminating instance [ 1091.235892] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1091.236117] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1091.236348] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-94c228b5-c856-4adb-a403-6214b9eabf36 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.238479] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1091.238676] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1091.239405] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d4f8c6b-d5ab-440c-8660-736ee82149cc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.246177] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1091.246408] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5bc1b206-8434-4270-bd0f-b94042ac1ee6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.248558] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1091.248703] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1091.249637] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-50820d53-6819-4f1e-a30a-b57bd2404de5 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.254262] env[67131]: DEBUG oslo_vmware.api [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Waiting for the task: (returnval){ [ 1091.254262] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52a2d1e8-6d85-b537-3da9-713c86382edd" [ 1091.254262] env[67131]: _type = "Task" [ 1091.254262] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1091.265325] env[67131]: DEBUG oslo_vmware.api [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52a2d1e8-6d85-b537-3da9-713c86382edd, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1091.325982] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1091.326220] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1091.326411] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Deleting the datastore file [datastore1] 7e46e878-1564-4f3b-baa5-5c99d7e04d80 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1091.326663] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-651b53a1-8fea-4c8a-a0fd-617a93b8eb07 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.333200] env[67131]: DEBUG oslo_vmware.api [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Waiting for the task: (returnval){ [ 1091.333200] env[67131]: value = "task-3456502" [ 1091.333200] env[67131]: _type = "Task" [ 1091.333200] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1091.340992] env[67131]: DEBUG oslo_vmware.api [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Task: {'id': task-3456502, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1091.764702] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1091.765056] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Creating directory with path [datastore1] vmware_temp/9244871f-425c-4047-825d-6029c15de8a2/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1091.765197] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5978027a-1013-4abd-bdce-81a03a4f9c04 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.777339] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Created directory with path [datastore1] vmware_temp/9244871f-425c-4047-825d-6029c15de8a2/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1091.777519] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 
tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Fetch image to [datastore1] vmware_temp/9244871f-425c-4047-825d-6029c15de8a2/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1091.777686] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/9244871f-425c-4047-825d-6029c15de8a2/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1091.778379] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-577936dc-ca78-4bf3-aa47-3a3616e6ca15 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.784622] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ed8232b-a074-4db3-87c4-801cbbfb4668 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.793280] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b9cd7a6-e572-4dcf-894d-abb82b297fcd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.823993] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83ca7fe4-1762-4588-b8a7-ed4984abbf6d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.829514] env[67131]: DEBUG 
oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b7739fee-5a18-4649-8487-e9c380061375 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.841594] env[67131]: DEBUG oslo_vmware.api [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Task: {'id': task-3456502, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08147} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1091.841804] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1091.841975] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1091.842159] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1091.842361] env[67131]: INFO nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Took 0.60 seconds to destroy the instance on the 
hypervisor. [ 1091.844391] env[67131]: DEBUG nova.compute.claims [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1091.844570] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1091.844785] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1091.913709] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1091.939546] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8634f5e8-0e72-49ab-8035-8806375ac1da {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.946870] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9f91f53a-1d38-4fba-b53e-71fe8fa4839d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.976281] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-374a371f-8f28-480f-912b-0347e3a2f522 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.984558] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a74eedeb-74a3-4b81-8f2f-401766f63efc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.998077] env[67131]: DEBUG nova.compute.provider_tree [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1092.006463] env[67131]: DEBUG nova.scheduler.client.report [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1092.020024] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 
tempest-ServersTestFqdnHostnames-618156096-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.175s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1092.020144] env[67131]: ERROR nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1092.020144] env[67131]: Faults: ['InvalidArgument'] [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Traceback (most recent call last): [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] self.driver.spawn(context, instance, image_meta, [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] self._fetch_image_if_missing(context, vi) [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] image_cache(vi, tmp_image_ds_loc) [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] vm_util.copy_virtual_disk( [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] session._wait_for_task(vmdk_copy_task) [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] return self.wait_for_task(task_ref) [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] return evt.wait() [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] result = hub.switch() [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", 
line 313, in switch [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] return self.greenlet.switch() [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] self.f(*self.args, **self.kw) [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] raise exceptions.translate_fault(task_info.error) [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Faults: ['InvalidArgument'] [ 1092.020144] env[67131]: ERROR nova.compute.manager [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] [ 1092.020950] env[67131]: DEBUG nova.compute.utils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] VimFaultException {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1092.023599] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Build of instance 7e46e878-1564-4f3b-baa5-5c99d7e04d80 was re-scheduled: A specified parameter was not correct: fileType [ 
1092.023599] env[67131]: Faults: ['InvalidArgument'] {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1092.024032] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1092.024196] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1092.024361] env[67131]: DEBUG nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1092.024516] env[67131]: DEBUG nova.network.neutron [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1092.070741] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1092.071558] env[67131]: ERROR nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Traceback (most recent call last): [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] result = getattr(controller, method)(*args, **kwargs) [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self._get(image_id) [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: 
aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] resp, body = self.http_client.get(url, headers=header) [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self.request(url, 'GET', **kwargs) [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self._handle_response(resp) [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise exc.from_response(resp, resp.content) [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] During handling of the above exception, another exception occurred: [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Traceback (most recent call last): [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] yield resources [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self.driver.spawn(context, instance, image_meta, [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self._fetch_image_if_missing(context, vi) [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] image_fetch(context, vi, tmp_image_ds_loc) [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] images.fetch_image( [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1092.071558] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] metadata = IMAGE_API.get(context, image_ref) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return session.show(context, image_id, [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] _reraise_translated_image_exception(image_id) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise new_exc.with_traceback(exc_trace) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1092.073587] env[67131]: ERROR nova.compute.manager 
[instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] result = getattr(controller, method)(*args, **kwargs) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self._get(image_id) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] resp, body = self.http_client.get(url, headers=header) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self.request(url, 'GET', **kwargs) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self._handle_response(resp) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise exc.from_response(resp, resp.content) [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. [ 1092.073587] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] [ 1092.073587] env[67131]: INFO nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Terminating instance [ 1092.073587] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1092.073587] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1092.074391] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 
tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1092.074391] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1092.074598] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b31a7ded-0cde-46b3-9f0b-b0a4685edc7d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.077154] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d1360b9-4f00-4722-9ae4-ca772db962d7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.084191] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1092.084398] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-115ba39b-63ce-4bf2-be39-02980d9843bc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.086535] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1092.086700] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1092.087613] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8bad5732-e709-48c4-9afd-a02b1d5113af {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.092187] env[67131]: DEBUG oslo_vmware.api [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Waiting for the task: (returnval){ [ 1092.092187] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52a9aca3-d31b-2cbd-fa71-11cce6386cf9" [ 1092.092187] env[67131]: _type = "Task" [ 1092.092187] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1092.099119] env[67131]: DEBUG oslo_vmware.api [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52a9aca3-d31b-2cbd-fa71-11cce6386cf9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1092.151082] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1092.151329] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1092.151473] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Deleting the datastore file [datastore1] aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1092.151715] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cbb0a320-ee07-435d-bb39-985e23336dba {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.157475] env[67131]: DEBUG oslo_vmware.api [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Waiting for the task: (returnval){ [ 1092.157475] env[67131]: value = "task-3456504" [ 1092.157475] env[67131]: _type = "Task" [ 1092.157475] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1092.164787] env[67131]: DEBUG oslo_vmware.api [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Task: {'id': task-3456504, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1092.327784] env[67131]: DEBUG nova.network.neutron [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1092.337828] env[67131]: INFO nova.compute.manager [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Took 0.31 seconds to deallocate network for instance. 
[ 1092.442498] env[67131]: INFO nova.scheduler.client.report [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Deleted allocations for instance 7e46e878-1564-4f3b-baa5-5c99d7e04d80 [ 1092.459189] env[67131]: DEBUG oslo_concurrency.lockutils [None req-0b0b5d6d-0112-4996-ab64-ae90700afbd3 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 466.291s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1092.460210] env[67131]: DEBUG oslo_concurrency.lockutils [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 266.687s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1092.460431] env[67131]: DEBUG oslo_concurrency.lockutils [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Acquiring lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1092.460632] env[67131]: DEBUG oslo_concurrency.lockutils [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1092.460790] env[67131]: DEBUG oslo_concurrency.lockutils [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1092.462874] env[67131]: INFO nova.compute.manager [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Terminating instance [ 1092.464656] env[67131]: DEBUG nova.compute.manager [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Start destroying the instance on the hypervisor. 
{{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1092.464844] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1092.465416] env[67131]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6ca358bc-8119-4218-9b13-4f6c94275839 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.474125] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85acc95f-793a-4311-9c39-915bf871e715 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.484942] env[67131]: DEBUG nova.compute.manager [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1092.504727] env[67131]: WARNING nova.virt.vmwareapi.vmops [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7e46e878-1564-4f3b-baa5-5c99d7e04d80 could not be found. 
[ 1092.504927] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1092.505265] env[67131]: INFO nova.compute.manager [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1092.505364] env[67131]: DEBUG oslo.service.loopingcall [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1092.505570] env[67131]: DEBUG nova.compute.manager [-] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1092.505666] env[67131]: DEBUG nova.network.neutron [-] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1092.532437] env[67131]: DEBUG nova.network.neutron [-] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1092.535600] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Acquiring 
lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1092.535832] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1092.537254] env[67131]: INFO nova.compute.claims [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1092.540183] env[67131]: INFO nova.compute.manager [-] [instance: 7e46e878-1564-4f3b-baa5-5c99d7e04d80] Took 0.03 seconds to deallocate network for instance. 
[ 1092.605715] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1092.605955] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Creating directory with path [datastore1] vmware_temp/609bd794-e893-43bb-af50-f10025863595/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1092.606607] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c76f9b67-82e5-43b7-8562-5a77ecfcc9e0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.618949] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Created directory with path [datastore1] vmware_temp/609bd794-e893-43bb-af50-f10025863595/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1092.619161] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Fetch image to [datastore1] vmware_temp/609bd794-e893-43bb-af50-f10025863595/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1092.619334] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c 
tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/609bd794-e893-43bb-af50-f10025863595/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1092.620053] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf8058a7-f7db-47c1-a74f-e1bcdb6ab856 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.627670] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fc2efdb-4571-4c7b-a6c1-29f67bd7d97e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.630376] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06d74bbd-eca9-405c-aab8-869256ac8ab2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.635990] env[67131]: DEBUG oslo_concurrency.lockutils [None req-f3eb1216-ba0f-42d9-9fc3-42c4cd04f004 tempest-ServersTestFqdnHostnames-618156096 tempest-ServersTestFqdnHostnames-618156096-project-member] Lock "7e46e878-1564-4f3b-baa5-5c99d7e04d80" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.176s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1092.643678] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d98f5655-aaf2-4e13-ae38-7f76f11b11e0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.648865] env[67131]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83dae22d-39e5-4939-8e62-783d15c85dcf {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.705455] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab315907-9a66-40b6-9c87-3f7115b4217e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.708343] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ee7c272-c6b2-4b8e-87d7-fc3322612614 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.718020] env[67131]: DEBUG oslo_vmware.api [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Task: {'id': task-3456504, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073531} completed successfully. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1092.718308] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1092.718377] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1092.718542] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1092.718703] env[67131]: INFO nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Took 0.64 seconds to destroy the instance on the hypervisor. 
[ 1092.720407] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bfbc2ba8-49dd-434c-a74c-61faa1c2932a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.722927] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3843d012-53bb-4ee0-95f5-ecc0596028b9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.726884] env[67131]: DEBUG nova.compute.claims [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1092.727057] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1092.736603] env[67131]: DEBUG nova.compute.provider_tree [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1092.745228] env[67131]: DEBUG nova.scheduler.client.report [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 
1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1092.750627] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1092.758044] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.222s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1092.758540] env[67131]: DEBUG nova.compute.manager [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Start building networks asynchronously for instance. 
{{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1092.761010] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.034s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1092.785767] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1092.786503] env[67131]: DEBUG nova.compute.utils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Instance aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5 could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1092.788304] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Instance disappeared during build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1092.788476] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1092.788638] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1092.788802] env[67131]: DEBUG nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1092.788957] env[67131]: DEBUG nova.network.neutron [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1092.791032] env[67131]: DEBUG nova.compute.utils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1092.792287] env[67131]: DEBUG nova.compute.manager [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a 
tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1092.792460] env[67131]: DEBUG nova.network.neutron [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1092.799888] env[67131]: DEBUG nova.compute.manager [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1092.863854] env[67131]: DEBUG nova.compute.manager [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Start spawning the instance on the hypervisor. 
{{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1092.865506] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1092.866266] env[67131]: ERROR nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Traceback (most recent call last): [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] result = getattr(controller, method)(*args, **kwargs) [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self._get(image_id) [ 
1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] resp, body = self.http_client.get(url, headers=header) [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self.request(url, 'GET', **kwargs) [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self._handle_response(resp) [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise exc.from_response(resp, resp.content) [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. 
Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] During handling of the above exception, another exception occurred: [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Traceback (most recent call last): [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] yield resources [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self.driver.spawn(context, instance, image_meta, [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self._fetch_image_if_missing(context, vi) [ 1092.866266] env[67131]: ERROR 
nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] image_fetch(context, vi, tmp_image_ds_loc) [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] images.fetch_image( [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1092.866266] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] metadata = IMAGE_API.get(context, image_ref) [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return session.show(context, image_id, [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] _reraise_translated_image_exception(image_id) [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise new_exc.with_traceback(exc_trace) [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: 
c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] result = getattr(controller, method)(*args, **kwargs) [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self._get(image_id) [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] resp, body = self.http_client.get(url, headers=header) [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self.request(url, 'GET', **kwargs) [ 1092.867322] env[67131]: ERROR 
nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self._handle_response(resp) [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise exc.from_response(resp, resp.content) [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. [ 1092.867322] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] [ 1092.867322] env[67131]: INFO nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Terminating instance [ 1092.868077] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1092.868287] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1092.868869] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1092.869079] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1092.869305] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ab2c448a-4a35-4613-a434-fc6a8fc90891 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.871754] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dccb9d79-0459-4b27-881f-4ff1d8569add {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.880998] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1092.881935] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-29da19cc-4fbd-471b-bd49-21a1b6875dc0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.883338] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 
tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1092.883506] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1092.884193] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3fe128a9-e8fd-4b75-9e73-566d1012ceac {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.889643] env[67131]: DEBUG oslo_vmware.api [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Waiting for the task: (returnval){ [ 1092.889643] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]522944d7-463e-67f6-3925-9cd8739ed63c" [ 1092.889643] env[67131]: _type = "Task" [ 1092.889643] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1092.900872] env[67131]: DEBUG oslo_vmware.api [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]522944d7-463e-67f6-3925-9cd8739ed63c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1092.904650] env[67131]: DEBUG nova.policy [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f38c4ed248b94f3c92a938189df2e66f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '461ab077e0de4778885ef8f8e0f411ba', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}} [ 1092.924234] env[67131]: DEBUG nova.virt.hardware [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1092.924648] env[67131]: DEBUG nova.virt.hardware [None 
req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1092.924648] env[67131]: DEBUG nova.virt.hardware [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1092.924796] env[67131]: DEBUG nova.virt.hardware [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1092.925695] env[67131]: DEBUG nova.virt.hardware [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1092.925695] env[67131]: DEBUG nova.virt.hardware [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1092.925695] env[67131]: DEBUG nova.virt.hardware [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1092.925695] env[67131]: DEBUG nova.virt.hardware [None 
req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1092.925695] env[67131]: DEBUG nova.virt.hardware [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1092.926344] env[67131]: DEBUG nova.virt.hardware [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1092.926344] env[67131]: DEBUG nova.virt.hardware [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1092.926874] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8e69272-83e0-4f07-ae12-cf71d49ad447 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.931472] env[67131]: DEBUG neutronclient.v2_0.client [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67131) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 
1092.933173] env[67131]: ERROR nova.compute.manager [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Traceback (most recent call last): [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] result = getattr(controller, method)(*args, **kwargs) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self._get(image_id) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1092.933173] 
env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] resp, body = self.http_client.get(url, headers=header) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self.request(url, 'GET', **kwargs) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self._handle_response(resp) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise exc.from_response(resp, resp.content) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] During handling of the above exception, another exception occurred: [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Traceback (most recent call last): [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self.driver.spawn(context, instance, image_meta, [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self._fetch_image_if_missing(context, vi) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] image_fetch(context, vi, tmp_image_ds_loc) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] images.fetch_image( [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] metadata = IMAGE_API.get(context, image_ref) [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return session.show(context, image_id, [ 1092.933173] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] _reraise_translated_image_exception(image_id) [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise new_exc.with_traceback(exc_trace) [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 
1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] result = getattr(controller, method)(*args, **kwargs) [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self._get(image_id) [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] resp, body = self.http_client.get(url, headers=header) [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self.request(url, 'GET', **kwargs) [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self._handle_response(resp) [ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise exc.from_response(resp, resp.content)
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511.
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] During handling of the above exception, another exception occurred:
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Traceback (most recent call last):
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self._build_and_run_instance(context, instance, image,
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] with excutils.save_and_reraise_exception():
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self.force_reraise()
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise self.value
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] with self.rt.instance_claim(context, instance, node, allocs,
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self.abort()
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1092.934176] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return f(*args, **kwargs)
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self._unset_instance_host_and_node(instance)
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] instance.save()
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] updates, result = self.indirection_api.object_action(
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] result = self.transport._send(
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self._driver.send(target, ctxt, message,
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise result
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] nova.exception_Remote.InstanceNotFound_Remote: Instance aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5 could not be found.
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Traceback (most recent call last):
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return getattr(target, method)(*args, **kwargs)
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return fn(self, *args, **kwargs)
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return f(*args, **kwargs)
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] with excutils.save_and_reraise_exception() as ectxt:
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self.force_reraise()
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise self.value
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return f(*args, **kwargs)
[ 1092.935332] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return f(context, *args, **kwargs)
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise exception.InstanceNotFound(instance_id=uuid)
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] nova.exception.InstanceNotFound: Instance aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5 could not be found.
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] During handling of the above exception, another exception occurred:
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Traceback (most recent call last):
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] ret = obj(*args, **kwargs)
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] exception_handler_v20(status_code, error_body)
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise client_exc(message=error_message,
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Neutron server returns request_ids: ['req-406480a3-3cb8-4b83-ac29-a25ad806387f']
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] During handling of the above exception, another exception occurred:
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] Traceback (most recent call last):
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self._deallocate_network(context, instance, requested_networks)
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self.network_api.deallocate_for_instance(
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] data = neutron.list_ports(**search_opts)
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] ret = obj(*args, **kwargs)
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self.list('ports', self.ports_path, retrieve_all,
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] ret = obj(*args, **kwargs)
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1092.936440] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] for r in self._pagination(collection, path, **params):
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] res = self.get(path, params=params)
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] ret = obj(*args, **kwargs)
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self.retry_request("GET", action, body=body,
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] ret = obj(*args, **kwargs)
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] return self.do_request(method, action, body=body,
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] ret = obj(*args, **kwargs)
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] self._handle_fault_response(status_code, replybody, resp)
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] raise exception.Unauthorized()
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5] nova.exception.Unauthorized: Not authorized.
[ 1092.941119] env[67131]: ERROR nova.compute.manager [instance: aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5]
[ 1092.941119] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-681b65f9-f20d-4a35-9e56-ca2b4a1f99dc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1092.964422] env[67131]: DEBUG oslo_concurrency.lockutils [None req-86646f92-9682-47e1-8f7d-49f4bb0ef122 tempest-ServersAdminTestJSON-1224387270 tempest-ServersAdminTestJSON-1224387270-project-member] Lock "aaa80cb4-a8b2-4f83-97d5-aa1e0e2383f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 404.306s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1092.974258] env[67131]: DEBUG nova.compute.manager [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Starting instance... {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1092.986889] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1092.986889] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1092.986889] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Deleting the datastore file [datastore1] c8c85f1c-6876-4632-a2d6-a835912d3285 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1092.986889] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a67ede10-6644-49ba-b26f-d99264591e12 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1092.994309] env[67131]: DEBUG oslo_vmware.api [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Waiting for the task: (returnval){
[ 1092.994309] env[67131]: value = "task-3456506"
[ 1092.994309] env[67131]: _type = "Task"
[ 1092.994309] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1093.000689] env[67131]: DEBUG oslo_vmware.api [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Task: {'id': task-3456506, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1093.023224] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1093.023481] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1093.025051] env[67131]: INFO nova.compute.claims [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1093.116266] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7afab7d-ec4c-4ebf-ba72-8c91ed2eda30 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.123365] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b53936c-164d-4e34-b352-a8e9b7e920fd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.152702] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-414a85ac-487b-4b64-bee5-b805b6545ce5 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.160710] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eac3108b-5d04-4182-bff3-6f9f03b3b8d4 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.174278] env[67131]: DEBUG nova.compute.provider_tree [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1093.184598] env[67131]: DEBUG nova.scheduler.client.report [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1093.198252] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.175s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1093.198883] env[67131]: DEBUG nova.compute.manager [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Start building networks asynchronously for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1093.201847] env[67131]: DEBUG nova.network.neutron [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Successfully created port: df990c58-f99e-4287-8cf9-6700ac5252da {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1093.237518] env[67131]: DEBUG nova.compute.utils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Using /dev/sd instead of None {{(pid=67131) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1093.238974] env[67131]: DEBUG nova.compute.manager [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Allocating IP information in the background. {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1093.239167] env[67131]: DEBUG nova.network.neutron [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] allocate_for_instance() {{(pid=67131) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1093.251111] env[67131]: DEBUG nova.compute.manager [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Start building block device mappings for instance. {{(pid=67131) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1093.333748] env[67131]: DEBUG nova.compute.manager [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Start spawning the instance on the hypervisor. {{(pid=67131) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1093.343490] env[67131]: DEBUG nova.policy [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '58eca239d2f34739a1b5f5d8f9f54dcc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0868c07f196248f9be808d9e6cf114bc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67131) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1093.358102] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T07:35:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T07:35:05Z,direct_url=,disk_format='vmdk',id=6f3f0d8a-6139-4366-9e0b-060c24ad2511,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b748d6760ea143ada1c17aae946fd343',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T07:35:05Z,virtual_size=,visibility=), allow threads: False {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1093.358343] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Flavor limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1093.358497] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Image limits 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1093.358676] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Flavor pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1093.358819] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Image pref 0:0:0 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1093.358961] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67131) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1093.359218] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1093.359395] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1093.359560] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Got 1 possible topologies {{(pid=67131) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1093.359719] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1093.359888] env[67131]: DEBUG nova.virt.hardware [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67131) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1093.360732] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-557f9ea5-740f-4312-90d3-efaa26c3923f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.368704] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efa4858f-f759-480b-a5ad-fb90f5415548 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.399711] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1093.399961] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Creating directory with path [datastore1] vmware_temp/b913c703-0855-4020-8ec5-47b72b4c961e/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1093.400268] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-27c1272e-d11d-443b-b5df-b36d4dc3ae1e {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.411628] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Created directory with path [datastore1] vmware_temp/b913c703-0855-4020-8ec5-47b72b4c961e/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1093.411814] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Fetch image to [datastore1] vmware_temp/b913c703-0855-4020-8ec5-47b72b4c961e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1093.412021] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/b913c703-0855-4020-8ec5-47b72b4c961e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1093.413176] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f8fdc8a-0de1-4a29-85c1-f3070e3f5198 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.422245] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfa6aa5b-2bf0-4fb2-8dbb-790a99c881be {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.432335] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e46d5847-31ca-44c1-9a01-cd01f01c2ce3 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.465106] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c6b9229-0b7a-4f0e-81eb-b3a32e7aa5cd {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.471045] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-715be53a-5bea-4cc6-b52b-7c16fd6291f9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.490876] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1093.502016] env[67131]: DEBUG oslo_vmware.api [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Task: {'id': task-3456506, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.094133} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1093.502016] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1093.502405] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1093.502717] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1093.503021] env[67131]: INFO nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Took 0.63 seconds to destroy the instance on the hypervisor.
[ 1093.505356] env[67131]: DEBUG nova.compute.claims [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1093.505809] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1093.506364] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1093.532112] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1093.532261] env[67131]: DEBUG nova.compute.utils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Instance c8c85f1c-6876-4632-a2d6-a835912d3285 could not be found. 
{{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1093.534242] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Instance disappeared during build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1093.534451] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1093.534641] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1093.534827] env[67131]: DEBUG nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1093.535272] env[67131]: DEBUG nova.network.neutron [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1093.596839] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1093.597854] env[67131]: ERROR nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. 
[ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Traceback (most recent call last): [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] result = getattr(controller, method)(*args, **kwargs) [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self._get(image_id) [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] resp, body = self.http_client.get(url, headers=header) [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self.request(url, 'GET', **kwargs) [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self._handle_response(resp) [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise exc.from_response(resp, resp.content) [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] During handling of the above exception, another exception occurred: [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Traceback (most recent call last): [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] yield resources [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self.driver.spawn(context, instance, image_meta, [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self._fetch_image_if_missing(context, vi) [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] image_fetch(context, vi, tmp_image_ds_loc) [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] images.fetch_image( [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1093.597854] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] metadata = IMAGE_API.get(context, image_ref) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return session.show(context, image_id, [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] _reraise_translated_image_exception(image_id) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise new_exc.with_traceback(exc_trace) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1093.599146] env[67131]: ERROR nova.compute.manager 
[instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] result = getattr(controller, method)(*args, **kwargs) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self._get(image_id) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] resp, body = self.http_client.get(url, headers=header) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self.request(url, 'GET', **kwargs) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self._handle_response(resp) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise exc.from_response(resp, resp.content) [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. [ 1093.599146] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1093.599146] env[67131]: INFO nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Terminating instance [ 1093.599806] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1093.599806] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1093.602749] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 
tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1093.602749] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1093.602749] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c993a584-4ba4-4853-9dc0-cb3825828069 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.604519] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0e756bf-fbe4-456e-a07e-b36e2b7feb47 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.616231] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1093.616483] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-de188bbb-333f-4ec2-8b0c-863f452fb155 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.621992] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Created directory with path [datastore1] devstack-image-cache_base 
{{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1093.622667] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1093.625028] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-244c2681-c0de-42e0-907c-6f42ca814d08 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.630474] env[67131]: DEBUG oslo_vmware.api [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Waiting for the task: (returnval){ [ 1093.630474] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5217c6a1-ebe8-ed20-2703-c6a976e74a5c" [ 1093.630474] env[67131]: _type = "Task" [ 1093.630474] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1093.637987] env[67131]: DEBUG oslo_vmware.api [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5217c6a1-ebe8-ed20-2703-c6a976e74a5c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1093.680613] env[67131]: DEBUG neutronclient.v2_0.client [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67131) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1093.681635] env[67131]: ERROR nova.compute.manager [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Traceback (most recent call last): [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] result = getattr(controller, method)(*args, **kwargs) [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self._get(image_id) [ 1093.681635] env[67131]: ERROR 
nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] resp, body = self.http_client.get(url, headers=header) [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self.request(url, 'GET', **kwargs) [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self._handle_response(resp) [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise exc.from_response(resp, resp.content) [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. 
Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] During handling of the above exception, another exception occurred: [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Traceback (most recent call last): [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self.driver.spawn(context, instance, image_meta, [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self._fetch_image_if_missing(context, vi) [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] image_fetch(context, vi, tmp_image_ds_loc) 
[ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] images.fetch_image( [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] metadata = IMAGE_API.get(context, image_ref) [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return session.show(context, image_id, [ 1093.681635] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] _reraise_translated_image_exception(image_id) [ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise new_exc.with_traceback(exc_trace) [ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1093.682777] env[67131]: ERROR nova.compute.manager 
[instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] result = getattr(controller, method)(*args, **kwargs)
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self._get(image_id)
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] resp, body = self.http_client.get(url, headers=header)
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self.request(url, 'GET', **kwargs)
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self._handle_response(resp)
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise exc.from_response(resp, resp.content)
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511.
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] During handling of the above exception, another exception occurred:
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Traceback (most recent call last):
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self._build_and_run_instance(context, instance, image,
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] with excutils.save_and_reraise_exception():
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self.force_reraise()
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise self.value
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] with self.rt.instance_claim(context, instance, node, allocs,
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self.abort()
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1093.682777] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return f(*args, **kwargs)
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self._unset_instance_host_and_node(instance)
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] instance.save()
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] updates, result = self.indirection_api.object_action(
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] result = self.transport._send(
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self._driver.send(target, ctxt, message,
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise result
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] nova.exception_Remote.InstanceNotFound_Remote: Instance c8c85f1c-6876-4632-a2d6-a835912d3285 could not be found.
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Traceback (most recent call last):
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return getattr(target, method)(*args, **kwargs)
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return fn(self, *args, **kwargs)
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return f(*args, **kwargs)
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] with excutils.save_and_reraise_exception() as ectxt:
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self.force_reraise()
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise self.value
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return f(*args, **kwargs)
[ 1093.683864] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return f(context, *args, **kwargs)
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise exception.InstanceNotFound(instance_id=uuid)
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] nova.exception.InstanceNotFound: Instance c8c85f1c-6876-4632-a2d6-a835912d3285 could not be found.
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] During handling of the above exception, another exception occurred:
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Traceback (most recent call last):
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] ret = obj(*args, **kwargs)
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] exception_handler_v20(status_code, error_body)
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise client_exc(message=error_message,
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Neutron server returns request_ids: ['req-f0737852-0ec3-4c93-aa74-344fe0bc21a8']
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] During handling of the above exception, another exception occurred:
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] Traceback (most recent call last):
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self._deallocate_network(context, instance, requested_networks)
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self.network_api.deallocate_for_instance(
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] data = neutron.list_ports(**search_opts)
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] ret = obj(*args, **kwargs)
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self.list('ports', self.ports_path, retrieve_all,
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] ret = obj(*args, **kwargs)
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1093.685243] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] for r in self._pagination(collection, path, **params):
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] res = self.get(path, params=params)
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] ret = obj(*args, **kwargs)
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self.retry_request("GET", action, body=body,
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] ret = obj(*args, **kwargs)
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] return self.do_request(method, action, body=body,
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] ret = obj(*args, **kwargs)
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] self._handle_fault_response(status_code, replybody, resp)
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] raise exception.Unauthorized()
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285] nova.exception.Unauthorized: Not authorized.
[ 1093.686891] env[67131]: ERROR nova.compute.manager [instance: c8c85f1c-6876-4632-a2d6-a835912d3285]
[ 1093.690293] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1093.690605] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1093.690794] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Deleting the datastore file [datastore1] 4ce5668d-b588-4b92-bcc9-11d03eff2a84 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1093.691054] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8eb35594-366e-46fb-9850-18dafecda753 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1093.698836] env[67131]: DEBUG oslo_vmware.api [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Waiting for the task: (returnval){
[ 1093.698836] env[67131]: value = "task-3456508"
[ 1093.698836] env[67131]: _type = "Task"
[ 1093.698836] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1093.706633] env[67131]: DEBUG oslo_vmware.api [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Task: {'id': task-3456508, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1093.707506] env[67131]: DEBUG oslo_concurrency.lockutils [None req-7069eeef-adcb-45fb-b391-06f9a275812c tempest-ServersTestMultiNic-588121047 tempest-ServersTestMultiNic-588121047-project-member] Lock "c8c85f1c-6876-4632-a2d6-a835912d3285" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 398.035s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1093.827198] env[67131]: DEBUG nova.network.neutron [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Successfully created port: c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c {{(pid=67131) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1094.140787] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1094.141049] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Creating directory with path [datastore1] vmware_temp/c53aea39-4033-45b1-8e85-c146cc303bc4/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1094.141277] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4ecac354-8a67-4c72-a35a-2ecd814a5dbe {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.153296] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Created directory with path [datastore1] vmware_temp/c53aea39-4033-45b1-8e85-c146cc303bc4/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1094.153486] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Fetch image to [datastore1] vmware_temp/c53aea39-4033-45b1-8e85-c146cc303bc4/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1094.153650] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/c53aea39-4033-45b1-8e85-c146cc303bc4/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1094.154426] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b79c96a-9a46-4b72-ae43-015c4fe3e1e0 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.161101] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f7e0511-44f5-4276-a4b6-584e6e00cc6d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.169918] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0b68d75-8970-4c8a-b6ad-da212d8c4891 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.203607] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d8784a7-5de3-4cca-926a-87793252a660 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.212036] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9d512327-ccfc-4b2b-ad37-2d281b21cf2b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.213663] env[67131]: DEBUG oslo_vmware.api [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Task: {'id': task-3456508, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07147} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1094.213895] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1094.214087] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1094.214257] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1094.214419] env[67131]: INFO nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 1094.216633] env[67131]: DEBUG nova.compute.claims [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1094.216801] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1094.217013] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1094.237336] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1094.244528] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1094.245220] env[67131]: DEBUG nova.compute.utils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Instance 4ce5668d-b588-4b92-bcc9-11d03eff2a84 could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1094.246780] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Instance disappeared during build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1094.246856] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1094.246967] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1094.247146] env[67131]: DEBUG nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1094.247304] env[67131]: DEBUG nova.network.neutron [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1094.272864] env[67131]: DEBUG neutronclient.v2_0.client [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67131) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1094.274515] env[67131]: ERROR nova.compute.manager [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Traceback (most recent call last): [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] result = getattr(controller, method)(*args, **kwargs) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self._get(image_id) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] resp, body = self.http_client.get(url, headers=header) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self.request(url, 'GET', **kwargs) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self._handle_response(resp) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise exc.from_response(resp, resp.content) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] During handling of the above exception, another exception occurred: [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Traceback (most recent call last): [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self.driver.spawn(context, instance, image_meta, [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self._fetch_image_if_missing(context, vi) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] image_fetch(context, vi, tmp_image_ds_loc) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] images.fetch_image( [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] metadata = IMAGE_API.get(context, image_ref) [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return session.show(context, image_id, [ 1094.274515] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] _reraise_translated_image_exception(image_id) [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise new_exc.with_traceback(exc_trace) [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 
1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] result = getattr(controller, method)(*args, **kwargs) [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self._get(image_id) [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] resp, body = self.http_client.get(url, headers=header) [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self.request(url, 'GET', **kwargs) [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self._handle_response(resp) [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise exc.from_response(resp, resp.content) [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] During handling of the above exception, another exception occurred: [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Traceback (most recent call last): [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self._build_and_run_instance(context, instance, image, [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] with excutils.save_and_reraise_exception(): [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self.force_reraise() [ 1094.275967] env[67131]: 
ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise self.value [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] with self.rt.instance_claim(context, instance, node, allocs, [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self.abort() [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1094.275967] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return f(*args, **kwargs) [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self._unset_instance_host_and_node(instance) [ 1094.277418] env[67131]: ERROR nova.compute.manager 
[instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] instance.save() [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] updates, result = self.indirection_api.object_action( [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return cctxt.call(context, 'object_action', objinst=objinst, [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] result = self.transport._send( [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self._driver.send(target, ctxt, message, [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self._send(target, ctxt, 
message, wait_for_reply, timeout, [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise result [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] nova.exception_Remote.InstanceNotFound_Remote: Instance 4ce5668d-b588-4b92-bcc9-11d03eff2a84 could not be found. [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Traceback (most recent call last): [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return getattr(target, method)(*args, **kwargs) [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return fn(self, *args, **kwargs) [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] old_ref, inst_ref = 
db.instance_update_and_get_original( [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return f(*args, **kwargs) [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] with excutils.save_and_reraise_exception() as ectxt: [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self.force_reraise() [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise self.value [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 
142, in wrapper [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return f(*args, **kwargs) [ 1094.277418] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return f(context, *args, **kwargs) [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise exception.InstanceNotFound(instance_id=uuid) [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] nova.exception.InstanceNotFound: Instance 4ce5668d-b588-4b92-bcc9-11d03eff2a84 could not be found. 
[ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] During handling of the above exception, another exception occurred: [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Traceback (most recent call last): [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] ret = obj(*args, **kwargs) [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] exception_handler_v20(status_code, error_body) [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise client_exc(message=error_message, [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 
4ce5668d-b588-4b92-bcc9-11d03eff2a84] Neutron server returns request_ids: ['req-de484c31-cd42-4a2a-916c-a56347913aea'] [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] During handling of the above exception, another exception occurred: [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] Traceback (most recent call last): [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self._deallocate_network(context, instance, requested_networks) [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self.network_api.deallocate_for_instance( [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] data = neutron.list_ports(**search_opts) [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] ret = obj(*args, **kwargs) [ 1094.279689] env[67131]: ERROR 
nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self.list('ports', self.ports_path, retrieve_all, [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] ret = obj(*args, **kwargs) [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1094.279689] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] for r in self._pagination(collection, path, **params): [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] res = self.get(path, params=params) [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] ret = obj(*args, **kwargs) [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self.retry_request("GET", action, body=body, [ 
1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] ret = obj(*args, **kwargs) [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] return self.do_request(method, action, body=body, [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] ret = obj(*args, **kwargs) [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] self._handle_fault_response(status_code, replybody, resp) [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] raise exception.Unauthorized() [ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84] nova.exception.Unauthorized: Not authorized. 
[ 1094.280836] env[67131]: ERROR nova.compute.manager [instance: 4ce5668d-b588-4b92-bcc9-11d03eff2a84]
[ 1094.283258] env[67131]: DEBUG nova.network.neutron [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Successfully updated port: df990c58-f99e-4287-8cf9-6700ac5252da {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1094.291507] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Acquiring lock "refresh_cache-0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1094.291636] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Acquired lock "refresh_cache-0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1094.291782] env[67131]: DEBUG nova.network.neutron [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1094.297046] env[67131]: DEBUG oslo_concurrency.lockutils [None req-2a033a59-3e33-4d33-9a9d-d5817a64b7c5 tempest-SecurityGroupsTestJSON-1175465788 tempest-SecurityGroupsTestJSON-1175465788-project-member] Lock "4ce5668d-b588-4b92-bcc9-11d03eff2a84" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 397.768s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1094.351192] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1094.352339] env[67131]: ERROR nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511.
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Traceback (most recent call last):
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] result = getattr(controller, method)(*args, **kwargs)
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self._get(image_id)
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] resp, body = self.http_client.get(url, headers=header)
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self.request(url, 'GET', **kwargs)
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self._handle_response(resp)
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise exc.from_response(resp, resp.content)
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] During handling of the above exception, another exception occurred:
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Traceback (most recent call last):
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] yield resources
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self.driver.spawn(context, instance, image_meta,
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self._fetch_image_if_missing(context, vi)
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] image_fetch(context, vi, tmp_image_ds_loc)
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] images.fetch_image(
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1094.352339] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] metadata = IMAGE_API.get(context, image_ref)
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return session.show(context, image_id,
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] _reraise_translated_image_exception(image_id)
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise new_exc.with_traceback(exc_trace)
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] result = getattr(controller, method)(*args, **kwargs)
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self._get(image_id)
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] resp, body = self.http_client.get(url, headers=header)
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self.request(url, 'GET', **kwargs)
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self._handle_response(resp)
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise exc.from_response(resp, resp.content)
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511.
[ 1094.354298] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]
[ 1094.354298] env[67131]: INFO nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Terminating instance
[ 1094.356249] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1094.356473] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1094.356729] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0c5ed58c-e9eb-4695-9329-eb16b78a83f2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.360319] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1094.360513] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1094.361323] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af9e8154-a065-4b79-b665-9197e885d12b {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.366862] env[67131]: DEBUG nova.compute.manager [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Received event network-vif-plugged-df990c58-f99e-4287-8cf9-6700ac5252da {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1094.366953] env[67131]: DEBUG oslo_concurrency.lockutils [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] Acquiring lock "0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1094.367161] env[67131]: DEBUG oslo_concurrency.lockutils [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] Lock "0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1094.367328] env[67131]: DEBUG oslo_concurrency.lockutils [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] Lock "0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1094.367583] env[67131]: DEBUG nova.compute.manager [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] No waiting events found dispatching network-vif-plugged-df990c58-f99e-4287-8cf9-6700ac5252da {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1094.367759] env[67131]: WARNING nova.compute.manager [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Received unexpected event network-vif-plugged-df990c58-f99e-4287-8cf9-6700ac5252da for instance with vm_state building and task_state spawning.
[ 1094.367918] env[67131]: DEBUG nova.compute.manager [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Received event network-changed-df990c58-f99e-4287-8cf9-6700ac5252da {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1094.368085] env[67131]: DEBUG nova.compute.manager [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Refreshing instance network info cache due to event network-changed-df990c58-f99e-4287-8cf9-6700ac5252da. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1094.368249] env[67131]: DEBUG oslo_concurrency.lockutils [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] Acquiring lock "refresh_cache-0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1094.372982] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1094.374117] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b94bb9d6-6283-40b3-9422-dffb9dfb0f78 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.375873] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1094.376078] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1094.377264] env[67131]: DEBUG nova.network.neutron [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1094.379241] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-188304b1-5461-45e9-9a8f-e11252987afe {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.384434] env[67131]: DEBUG oslo_vmware.api [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Waiting for the task: (returnval){
[ 1094.384434] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52b566c5-c063-2017-e626-3000955f0b2f"
[ 1094.384434] env[67131]: _type = "Task"
[ 1094.384434] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1094.391753] env[67131]: DEBUG oslo_vmware.api [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52b566c5-c063-2017-e626-3000955f0b2f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1094.442782] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1094.442998] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1094.443188] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Deleting the datastore file [datastore1] 3b2e1650-ee7f-46a2-94db-1a611384be03 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1094.443430] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-05679125-e53a-4739-a171-4f82c45625da {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.450842] env[67131]: DEBUG oslo_vmware.api [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Waiting for the task: (returnval){
[ 1094.450842] env[67131]: value = "task-3456510"
[ 1094.450842] env[67131]: _type = "Task"
[ 1094.450842] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1094.458638] env[67131]: DEBUG oslo_vmware.api [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Task: {'id': task-3456510, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1094.852603] env[67131]: DEBUG nova.network.neutron [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Updating instance_info_cache with network_info: [{"id": "df990c58-f99e-4287-8cf9-6700ac5252da", "address": "fa:16:3e:65:34:19", "network": {"id": "5902f393-a975-49df-a0be-ba75257cf4cb", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1816111526-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "461ab077e0de4778885ef8f8e0f411ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "03ac2c9c-6ad2-4a85-bfab-c7e336df859a", "external-id": "nsx-vlan-transportzone-379", "segmentation_id": 379, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdf990c58-f9", "ovs_interfaceid": "df990c58-f99e-4287-8cf9-6700ac5252da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1094.862848] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Releasing lock "refresh_cache-0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1094.863147] env[67131]: DEBUG nova.compute.manager [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Instance network_info: |[{"id": "df990c58-f99e-4287-8cf9-6700ac5252da", "address": "fa:16:3e:65:34:19", "network": {"id": "5902f393-a975-49df-a0be-ba75257cf4cb", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1816111526-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "461ab077e0de4778885ef8f8e0f411ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "03ac2c9c-6ad2-4a85-bfab-c7e336df859a", "external-id": "nsx-vlan-transportzone-379", "segmentation_id": 379, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdf990c58-f9", "ovs_interfaceid": "df990c58-f99e-4287-8cf9-6700ac5252da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 1094.863707] env[67131]: DEBUG oslo_concurrency.lockutils [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] Acquired lock "refresh_cache-0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1094.863707] env[67131]: DEBUG nova.network.neutron [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Refreshing network info cache for port df990c58-f99e-4287-8cf9-6700ac5252da {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1094.864738] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:65:34:19', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '03ac2c9c-6ad2-4a85-bfab-c7e336df859a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'df990c58-f99e-4287-8cf9-6700ac5252da', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1094.872246] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Creating folder: Project (461ab077e0de4778885ef8f8e0f411ba). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1094.872997] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-29210b16-5ca0-4f1a-ae90-3b94e7095903 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.889448] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Created folder: Project (461ab077e0de4778885ef8f8e0f411ba) in parent group-v690228.
[ 1094.889655] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Creating folder: Instances. Parent ref: group-v690298. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1094.889882] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-de393f1e-ede6-498c-930a-a8423c86f408 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.896500] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1094.896727] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Creating directory with path [datastore1] vmware_temp/9f9c9947-427f-414e-b655-6124d887a1cd/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1094.896921] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c7d968d8-908b-4625-a5b5-55a90b829640 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.899445] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Created folder: Instances in parent group-v690298.
[ 1094.899605] env[67131]: DEBUG oslo.service.loopingcall [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1094.900036] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1094.900208] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e2c0081e-938d-46bc-9f44-a7fc012b469a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.916659] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Created directory with path [datastore1] vmware_temp/9f9c9947-427f-414e-b655-6124d887a1cd/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1094.916842] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Fetch image to [datastore1] vmware_temp/9f9c9947-427f-414e-b655-6124d887a1cd/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1094.917010] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/9f9c9947-427f-414e-b655-6124d887a1cd/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1094.917960] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02f4daff-7568-4009-abf0-ad184f261802 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.921636] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1094.921636] env[67131]: value = "task-3456513"
[ 1094.921636] env[67131]: _type = "Task"
[ 1094.921636] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1094.928444] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19ecc3c1-b42f-4d9c-ab4c-369943da19f5 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.951514] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456513, 'name': CreateVM_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1094.953112] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc28afda-c8b7-46c4-b72a-dc36a1e7c7a1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.965525] env[67131]: DEBUG oslo_vmware.api [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Task: {'id': task-3456510, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.083438} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1094.990124] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1094.990358] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1094.990549] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1094.990741] env[67131]: INFO nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Took 0.63 seconds to destroy the instance on the hypervisor.
[ 1094.993870] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6de3c4cf-f647-4555-98e1-2bb51f5cd17f {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1094.999110] env[67131]: DEBUG nova.compute.claims [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1094.999365] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1094.999627] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1095.005859] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1978c3ce-915e-4343-b68e-4dd521cfb7fc {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.025916] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1095.028977] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.029s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1095.029622] env[67131]: DEBUG nova.compute.utils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Instance 3b2e1650-ee7f-46a2-94db-1a611384be03 could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1095.031320] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Instance disappeared during build.
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1095.031491] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1095.031653] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1095.031814] env[67131]: DEBUG nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1095.031973] env[67131]: DEBUG nova.network.neutron [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1095.060744] env[67131]: DEBUG neutronclient.v2_0.client [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67131) _handle_fault_response 
/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1095.062507] env[67131]: ERROR nova.compute.manager [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Traceback (most recent call last): [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] result = getattr(controller, method)(*args, **kwargs) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self._get(image_id) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] resp, body = self.http_client.get(url, headers=header) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self.request(url, 'GET', **kwargs) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self._handle_response(resp) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise exc.from_response(resp, resp.content) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] During handling of the above exception, another exception occurred: [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Traceback (most recent call last): [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self.driver.spawn(context, instance, image_meta, [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self._fetch_image_if_missing(context, vi) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] image_fetch(context, vi, tmp_image_ds_loc) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] images.fetch_image( [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] metadata = IMAGE_API.get(context, image_ref) [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return session.show(context, image_id, [ 1095.062507] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] _reraise_translated_image_exception(image_id) [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise new_exc.with_traceback(exc_trace) [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 
1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] result = getattr(controller, method)(*args, **kwargs) [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self._get(image_id) [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] resp, body = self.http_client.get(url, headers=header) [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self.request(url, 'GET', **kwargs) [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self._handle_response(resp) [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise exc.from_response(resp, resp.content) [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] During handling of the above exception, another exception occurred: [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Traceback (most recent call last): [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self._build_and_run_instance(context, instance, image, [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] with excutils.save_and_reraise_exception(): [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self.force_reraise() [ 1095.067364] env[67131]: 
ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise self.value [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] with self.rt.instance_claim(context, instance, node, allocs, [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self.abort() [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1095.067364] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return f(*args, **kwargs) [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self._unset_instance_host_and_node(instance) [ 1095.068338] env[67131]: ERROR nova.compute.manager 
[instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] instance.save() [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] updates, result = self.indirection_api.object_action( [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return cctxt.call(context, 'object_action', objinst=objinst, [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] result = self.transport._send( [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self._driver.send(target, ctxt, message, [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return self._send(target, ctxt, 
message, wait_for_reply, timeout, [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise result [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] nova.exception_Remote.InstanceNotFound_Remote: Instance 3b2e1650-ee7f-46a2-94db-1a611384be03 could not be found. [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Traceback (most recent call last): [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return getattr(target, method)(*args, **kwargs) [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return fn(self, *args, **kwargs) [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] old_ref, inst_ref = 
db.instance_update_and_get_original( [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return f(*args, **kwargs) [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] with excutils.save_and_reraise_exception() as ectxt: [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] self.force_reraise() [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise self.value [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 
142, in wrapper [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return f(*args, **kwargs) [ 1095.068338] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] return f(context, *args, **kwargs) [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise exception.InstanceNotFound(instance_id=uuid) [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] nova.exception.InstanceNotFound: Instance 3b2e1650-ee7f-46a2-94db-1a611384be03 could not be found. 
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] During handling of the above exception, another exception occurred: [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Traceback (most recent call last): [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] ret = obj(*args, **kwargs) [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] exception_handler_v20(status_code, error_body) [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] raise client_exc(message=error_message, [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 
3b2e1650-ee7f-46a2-94db-1a611384be03] Neutron server returns request_ids: ['req-6070b7c3-b88d-4096-9524-da012d5aa311']
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] During handling of the above exception, another exception occurred:
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] Traceback (most recent call last):
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     self._deallocate_network(context, instance, requested_networks)
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     self.network_api.deallocate_for_instance(
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     data = neutron.list_ports(**search_opts)
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     ret = obj(*args, **kwargs)
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     return self.list('ports', self.ports_path, retrieve_all,
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     ret = obj(*args, **kwargs)
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1095.069511] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     for r in self._pagination(collection, path, **params):
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     res = self.get(path, params=params)
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     ret = obj(*args, **kwargs)
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     return self.retry_request("GET", action, body=body,
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     ret = obj(*args, **kwargs)
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     return self.do_request(method, action, body=body,
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     ret = obj(*args, **kwargs)
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     self._handle_fault_response(status_code, replybody, resp)
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]     raise exception.Unauthorized()
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03] nova.exception.Unauthorized: Not authorized.
[ 1095.072319] env[67131]: ERROR nova.compute.manager [instance: 3b2e1650-ee7f-46a2-94db-1a611384be03]
[ 1095.090104] env[67131]: DEBUG oslo_concurrency.lockutils [None req-584544b9-1cfb-4278-89d0-c9761023035e tempest-ServerDiskConfigTestJSON-643728301 tempest-ServerDiskConfigTestJSON-643728301-project-member] Lock "3b2e1650-ee7f-46a2-94db-1a611384be03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 397.762s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1095.123407] env[67131]: DEBUG nova.network.neutron [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Successfully updated port: c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c {{(pid=67131) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1095.135731] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Acquiring lock "refresh_cache-8f4044dd-2e1d-40cb-99f9-96c9584bfe5e" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1095.135850] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Acquired lock "refresh_cache-8f4044dd-2e1d-40cb-99f9-96c9584bfe5e" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1095.136424] env[67131]: DEBUG nova.network.neutron [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Building network info cache for instance {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1095.156784] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1095.157435] env[67131]: ERROR nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511.
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] Traceback (most recent call last):
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     result = getattr(controller, method)(*args, **kwargs)
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return self._get(image_id)
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     resp, body = self.http_client.get(url, headers=header)
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return self.request(url, 'GET', **kwargs)
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return self._handle_response(resp)
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     raise exc.from_response(resp, resp.content)
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] During handling of the above exception, another exception occurred:
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] Traceback (most recent call last):
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     yield resources
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     self.driver.spawn(context, instance, image_meta,
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     self._fetch_image_if_missing(context, vi)
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     images.fetch_image(
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1095.157435] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     metadata = IMAGE_API.get(context, image_ref)
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return session.show(context, image_id,
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     _reraise_translated_image_exception(image_id)
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     raise new_exc.with_traceback(exc_trace)
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     result = getattr(controller, method)(*args, **kwargs)
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return self._get(image_id)
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     resp, body = self.http_client.get(url, headers=header)
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return self.request(url, 'GET', **kwargs)
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return self._handle_response(resp)
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     raise exc.from_response(resp, resp.content)
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511.
[ 1095.159030] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.159030] env[67131]: INFO nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Terminating instance
[ 1095.160198] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1095.160198] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1095.161239] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1095.161453] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1095.161699] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2307b5f4-c1f6-415a-8970-e2c5d517cf29 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.164589] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76a908d1-1827-4d88-bddc-5bed4f35dd34 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.174778] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1095.175757] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c53781af-90d5-40d0-968c-e64f84b47521 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.177399] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1095.177595] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1095.178293] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3c65410f-3544-4f45-8c37-e1250d235ace {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.183449] env[67131]: DEBUG oslo_vmware.api [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Waiting for the task: (returnval){
[ 1095.183449] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52605b11-d92f-9d13-ef74-85f5c8779a6e"
[ 1095.183449] env[67131]: _type = "Task"
[ 1095.183449] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1095.190897] env[67131]: DEBUG oslo_vmware.api [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52605b11-d92f-9d13-ef74-85f5c8779a6e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1095.233565] env[67131]: DEBUG nova.network.neutron [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Instance cache missing network info. {{(pid=67131) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1095.258755] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1095.258970] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1095.259169] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Deleting the datastore file [datastore1] c5368926-ed52-414f-9342-27c71e4e3557 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1095.259426] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e25febd7-f8f7-4e8c-bc22-77deb05d2478 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.265993] env[67131]: DEBUG oslo_vmware.api [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Waiting for the task: (returnval){
[ 1095.265993] env[67131]: value = "task-3456515"
[ 1095.265993] env[67131]: _type = "Task"
[ 1095.265993] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1095.273992] env[67131]: DEBUG oslo_vmware.api [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Task: {'id': task-3456515, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1095.430585] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456513, 'name': CreateVM_Task, 'duration_secs': 0.310689} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1095.430756] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1095.431390] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1095.431549] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1095.431855] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 1095.432092] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f899749-eaf6-4bf2-b39a-e8f5a6e02335 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.436242] env[67131]: DEBUG oslo_vmware.api [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Waiting for the task: (returnval){
[ 1095.436242] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52930f25-2044-11d0-d24b-7a4f4f0b707e"
[ 1095.436242] env[67131]: _type = "Task"
[ 1095.436242] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1095.443123] env[67131]: DEBUG oslo_vmware.api [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]52930f25-2044-11d0-d24b-7a4f4f0b707e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1095.462178] env[67131]: DEBUG nova.network.neutron [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Updated VIF entry in instance network info cache for port df990c58-f99e-4287-8cf9-6700ac5252da. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1095.462495] env[67131]: DEBUG nova.network.neutron [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Updating instance_info_cache with network_info: [{"id": "df990c58-f99e-4287-8cf9-6700ac5252da", "address": "fa:16:3e:65:34:19", "network": {"id": "5902f393-a975-49df-a0be-ba75257cf4cb", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1816111526-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "461ab077e0de4778885ef8f8e0f411ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "03ac2c9c-6ad2-4a85-bfab-c7e336df859a", "external-id": "nsx-vlan-transportzone-379", "segmentation_id": 379, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdf990c58-f9", "ovs_interfaceid": "df990c58-f99e-4287-8cf9-6700ac5252da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1095.471480] env[67131]: DEBUG oslo_concurrency.lockutils [req-acf1215b-ead4-49b1-a372-382923c2d7ef req-721c7b33-0fd8-41db-8cc5-327f07156879 service nova] Releasing lock "refresh_cache-0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1095.503513] env[67131]: DEBUG nova.network.neutron [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Updating instance_info_cache with network_info: [{"id": "c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c", "address": "fa:16:3e:61:86:7b", "network": {"id": "6976c12b-0e06-40e0-a5b9-2b13a3ee7be1", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2143250681-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0868c07f196248f9be808d9e6cf114bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4b43a78-f49b-4132-ab2e-6e28769a9498", "external-id": "nsx-vlan-transportzone-737", "segmentation_id": 737, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc4f32e5c-b0", "ovs_interfaceid": "c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1095.515090] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Releasing lock "refresh_cache-8f4044dd-2e1d-40cb-99f9-96c9584bfe5e" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1095.515378] env[67131]: DEBUG nova.compute.manager [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Instance network_info: |[{"id": "c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c", "address": "fa:16:3e:61:86:7b", "network": {"id": "6976c12b-0e06-40e0-a5b9-2b13a3ee7be1", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2143250681-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0868c07f196248f9be808d9e6cf114bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4b43a78-f49b-4132-ab2e-6e28769a9498", "external-id": "nsx-vlan-transportzone-737", "segmentation_id": 737, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc4f32e5c-b0", "ovs_interfaceid": "c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67131) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 1095.515721] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:61:86:7b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd4b43a78-f49b-4132-ab2e-6e28769a9498', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c', 'vif_model': 'vmxnet3'}] {{(pid=67131) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1095.523447] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Creating folder: Project (0868c07f196248f9be808d9e6cf114bc). Parent ref: group-v690228. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1095.523882] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-84bc2da5-6266-46de-85a1-834db629ec6c {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.534101] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Created folder: Project (0868c07f196248f9be808d9e6cf114bc) in parent group-v690228.
[ 1095.534266] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Creating folder: Instances. Parent ref: group-v690301. {{(pid=67131) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1095.534481] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3f0e24fa-3441-48a1-b8a4-567b528a7050 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.694211] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1095.694447] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Creating directory with path [datastore1] vmware_temp/597097db-4121-46a1-8dd3-e3a68c37fa72/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1095.694705] env[67131]: INFO nova.virt.vmwareapi.vm_util [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Created folder: Instances in parent group-v690301.
[ 1095.694903] env[67131]: DEBUG oslo.service.loopingcall [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67131) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1095.695102] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e9802815-ef55-4715-b475-e81df25fb847 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.696845] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Creating VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1095.697056] env[67131]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-94daaee1-5e06-42b5-bd87-81dd48b7ebd6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1095.715449] env[67131]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1095.715449] env[67131]: value = "task-3456518"
[ 1095.715449] env[67131]: _type = "Task"
[ 1095.715449] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1095.722370] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456518, 'name': CreateVM_Task} progress is 0%.
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1095.728931] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Created directory with path [datastore1] vmware_temp/597097db-4121-46a1-8dd3-e3a68c37fa72/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1095.729133] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Fetch image to [datastore1] vmware_temp/597097db-4121-46a1-8dd3-e3a68c37fa72/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1095.729306] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/597097db-4121-46a1-8dd3-e3a68c37fa72/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1095.729975] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c19b2382-38bc-4c4f-ad63-d55dfd20a449 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.735950] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49072f8c-855a-489a-89f2-7fce3a0818fd {{(pid=67131) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.745008] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c85916a-4720-4c1d-8cfb-9f2b306c1065 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.779554] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0b67317-db2e-43be-87ee-5053f30e0abe {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.786539] env[67131]: DEBUG oslo_vmware.api [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Task: {'id': task-3456515, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075922} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1095.787995] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1095.788221] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1095.788396] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: 
c5368926-ed52-414f-9342-27c71e4e3557] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1095.788544] env[67131]: INFO nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1095.790325] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-edc091ab-6c89-49e7-9bb8-4df70bf43062 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.792197] env[67131]: DEBUG nova.compute.claims [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1095.792364] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1095.792568] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1095.820050] env[67131]: DEBUG nova.virt.vmwareapi.images [None 
req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1095.822929] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.030s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1095.823638] env[67131]: DEBUG nova.compute.utils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Instance c5368926-ed52-414f-9342-27c71e4e3557 could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1095.825156] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Instance disappeared during build. 
{{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1095.825329] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1095.825509] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1095.825697] env[67131]: DEBUG nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1095.825870] env[67131]: DEBUG nova.network.neutron [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1095.917258] env[67131]: DEBUG oslo_vmware.rw_handles [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = 
https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/597097db-4121-46a1-8dd3-e3a68c37fa72/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1095.974740] env[67131]: DEBUG neutronclient.v2_0.client [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67131) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1095.976266] env[67131]: ERROR nova.compute.manager [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] [instance: c5368926-ed52-414f-9342-27c71e4e3557] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] Traceback (most recent call last): [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] result = getattr(controller, method)(*args, **kwargs) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return self._get(image_id) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] resp, body = self.http_client.get(url, headers=header) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return self.request(url, 'GET', **kwargs) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return self._handle_response(resp) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] raise exc.from_response(resp, resp.content) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] During handling of the above exception, another exception occurred: [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] Traceback (most recent call last): [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] self.driver.spawn(context, instance, image_meta, [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] self._fetch_image_if_missing(context, vi) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] image_fetch(context, vi, tmp_image_ds_loc) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] images.fetch_image( [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] metadata = IMAGE_API.get(context, image_ref) [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return session.show(context, image_id, [ 1095.976266] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] _reraise_translated_image_exception(image_id) [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] raise new_exc.with_traceback(exc_trace) [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 
1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] result = getattr(controller, method)(*args, **kwargs) [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return self._get(image_id) [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] resp, body = self.http_client.get(url, headers=header) [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return self.request(url, 'GET', **kwargs) [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return self._handle_response(resp) [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] raise exc.from_response(resp, resp.content) [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] During handling of the above exception, another exception occurred: [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] Traceback (most recent call last): [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] self._build_and_run_instance(context, instance, image, [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] with excutils.save_and_reraise_exception(): [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] self.force_reraise() [ 1095.977353] env[67131]: 
ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] raise self.value [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] with self.rt.instance_claim(context, instance, node, allocs, [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] self.abort() [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1095.977353] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return f(*args, **kwargs) [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] self._unset_instance_host_and_node(instance) [ 1095.978388] env[67131]: ERROR nova.compute.manager 
[instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] instance.save() [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] updates, result = self.indirection_api.object_action( [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return cctxt.call(context, 'object_action', objinst=objinst, [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] result = self.transport._send( [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return self._driver.send(target, ctxt, message, [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return self._send(target, ctxt, 
message, wait_for_reply, timeout, [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] raise result [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] nova.exception_Remote.InstanceNotFound_Remote: Instance c5368926-ed52-414f-9342-27c71e4e3557 could not be found. [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] Traceback (most recent call last): [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return getattr(target, method)(*args, **kwargs) [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] return fn(self, *args, **kwargs) [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] old_ref, inst_ref = 
db.instance_update_and_get_original(
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return f(*args, **kwargs)
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     with excutils.save_and_reraise_exception() as ectxt:
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     self.force_reraise()
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     raise self.value
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return f(*args, **kwargs)
[ 1095.978388] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return f(context, *args, **kwargs)
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     raise exception.InstanceNotFound(instance_id=uuid)
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] nova.exception.InstanceNotFound: Instance c5368926-ed52-414f-9342-27c71e4e3557 could not be found.
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] During handling of the above exception, another exception occurred:
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] Traceback (most recent call last):
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     ret = obj(*args, **kwargs)
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     exception_handler_v20(status_code, error_body)
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     raise client_exc(message=error_message,
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] Neutron server returns request_ids: ['req-e9604702-ecb8-47dd-878e-c3009a68203b']
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] During handling of the above exception, another exception occurred:
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] Traceback (most recent call last):
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     self._deallocate_network(context, instance, requested_networks)
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     self.network_api.deallocate_for_instance(
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     data = neutron.list_ports(**search_opts)
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     ret = obj(*args, **kwargs)
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return self.list('ports', self.ports_path, retrieve_all,
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     ret = obj(*args, **kwargs)
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1095.979693] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     for r in self._pagination(collection, path, **params):
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     res = self.get(path, params=params)
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     ret = obj(*args, **kwargs)
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return self.retry_request("GET", action, body=body,
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     ret = obj(*args, **kwargs)
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     return self.do_request(method, action, body=body,
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     ret = obj(*args, **kwargs)
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     self._handle_fault_response(status_code, replybody, resp)
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]     raise exception.Unauthorized()
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557] nova.exception.Unauthorized: Not authorized.
[ 1095.980850] env[67131]: ERROR nova.compute.manager [instance: c5368926-ed52-414f-9342-27c71e4e3557]
[ 1095.982510] env[67131]: DEBUG oslo_vmware.rw_handles [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1095.982733] env[67131]: DEBUG oslo_vmware.rw_handles [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/597097db-4121-46a1-8dd3-e3a68c37fa72/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1095.988605] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1095.988991] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1095.989224] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e34687d5-7fde-4fd1-a025-982b8db99c4a tempest-ServerPasswordTestJSON-452473640 tempest-ServerPasswordTestJSON-452473640-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1096.000062] env[67131]: DEBUG oslo_concurrency.lockutils [None req-e8ecec02-696d-4cfc-80f8-b92638420283 tempest-AttachInterfacesUnderV243Test-1218726166 tempest-AttachInterfacesUnderV243Test-1218726166-project-member] Lock "c5368926-ed52-414f-9342-27c71e4e3557" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 385.608s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1096.225349] env[67131]: DEBUG oslo_vmware.api [-] Task: {'id': task-3456518, 'name': CreateVM_Task, 'duration_secs': 0.263044} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1096.225500] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Created VM on the ESX host {{(pid=67131) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1096.226127] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1096.226297] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1096.226616] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 1096.226847] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9b3095b9-42e6-485d-9f20-394b7daf39c9 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1096.231500] env[67131]: DEBUG oslo_vmware.api [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Waiting for the task: (returnval){
[ 1096.231500] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5290cb29-85a4-9319-8c6a-653f5da40e1b"
[ 1096.231500] env[67131]: _type = "Task"
[ 1096.231500] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1096.245941] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1096.246181] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Processing image 6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1096.246387] env[67131]: DEBUG oslo_concurrency.lockutils [None req-79899d80-743a-4ebf-816b-3efb65b5e353 tempest-AttachVolumeShelveTestJSON-2126008604 tempest-AttachVolumeShelveTestJSON-2126008604-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1096.477835] env[67131]: DEBUG nova.compute.manager [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Received event network-vif-plugged-c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1096.478063] env[67131]: DEBUG oslo_concurrency.lockutils [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] Acquiring lock "8f4044dd-2e1d-40cb-99f9-96c9584bfe5e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1096.478264] env[67131]: DEBUG oslo_concurrency.lockutils [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] Lock "8f4044dd-2e1d-40cb-99f9-96c9584bfe5e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1096.478428] env[67131]: DEBUG oslo_concurrency.lockutils [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] Lock "8f4044dd-2e1d-40cb-99f9-96c9584bfe5e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1096.478587] env[67131]: DEBUG nova.compute.manager [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] No waiting events found dispatching network-vif-plugged-c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c {{(pid=67131) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1096.478749] env[67131]: WARNING nova.compute.manager [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Received unexpected event network-vif-plugged-c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c for instance with vm_state building and task_state spawning.
[ 1096.478906] env[67131]: DEBUG nova.compute.manager [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Received event network-changed-c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1096.479075] env[67131]: DEBUG nova.compute.manager [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Refreshing instance network info cache due to event network-changed-c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c. {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1096.479261] env[67131]: DEBUG oslo_concurrency.lockutils [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] Acquiring lock "refresh_cache-8f4044dd-2e1d-40cb-99f9-96c9584bfe5e" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1096.479394] env[67131]: DEBUG oslo_concurrency.lockutils [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] Acquired lock "refresh_cache-8f4044dd-2e1d-40cb-99f9-96c9584bfe5e" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1096.479543] env[67131]: DEBUG nova.network.neutron [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Refreshing network info cache for port c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c {{(pid=67131) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1096.703770] env[67131]: DEBUG nova.network.neutron [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Updated VIF entry in instance network info cache for port c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c. {{(pid=67131) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1096.704226] env[67131]: DEBUG nova.network.neutron [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Updating instance_info_cache with network_info: [{"id": "c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c", "address": "fa:16:3e:61:86:7b", "network": {"id": "6976c12b-0e06-40e0-a5b9-2b13a3ee7be1", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2143250681-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0868c07f196248f9be808d9e6cf114bc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4b43a78-f49b-4132-ab2e-6e28769a9498", "external-id": "nsx-vlan-transportzone-737", "segmentation_id": 737, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc4f32e5c-b0", "ovs_interfaceid": "c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1096.713310] env[67131]: DEBUG oslo_concurrency.lockutils [req-985abea8-e7e7-4f7a-9f7e-8d5cdb6132cb req-98d2ece7-9995-4af3-a823-0a0e1b0c1971 service nova] Releasing lock "refresh_cache-8f4044dd-2e1d-40cb-99f9-96c9584bfe5e" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1102.215221] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1102.215589] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67131) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1103.215511] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1105.215620] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1105.215998] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Starting heal instance info cache {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1105.215998] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Rebuilding the list of instances to heal {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1105.228932] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1105.229105] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1105.229241] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Skipping network cache update for instance because it is Building. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1105.229371] env[67131]: DEBUG nova.compute.manager [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Didn't find any instances for network info cache update. {{(pid=67131) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1106.215116] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1106.215398] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1106.215584] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1107.216803] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager.update_available_resource {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1107.228485] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1107.228738] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1107.228848] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1107.228990] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67131) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1107.230054] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4930ae29-3de6-4d0d-bb66-28b85a0b40ae {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1107.238933] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a5d391a-10ad-42c8-955d-306331278688 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1107.252430] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8de312fc-a691-4e3c-b911-4884f42d619a {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1107.258401] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8f24572-1a9b-4c64-9b28-b0f2fb9e0c43 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1107.286529] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180874MB free_disk=95GB free_vcpus=48 pci_devices=None {{(pid=67131) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1107.286690] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1107.286871] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1107.327803] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 2fd6ec26-9e42-43fd-a09c-de43a8107aee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1107.327951] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1107.328090] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Instance 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67131) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1107.328259] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1107.328396] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=67131) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1107.372393] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c69b9994-d7b4-4ac0-9a76-a6468ac91ed6 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1107.381831] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-565baa8c-8013-4651-aa21-87e00f7ddbc7 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1107.411559] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78c05be1-6bc5-467d-8001-b645ed56d675 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1107.418270] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89cb109a-ac88-487e-bf62-9d5c1dc4d259 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1107.430954] env[67131]: DEBUG nova.compute.provider_tree [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed in ProviderTree for provider: d05f24fe-4395-4079-99ef-1ac1245f55e5 {{(pid=67131) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1107.439448] env[67131]: DEBUG nova.scheduler.client.report [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Inventory has not changed for provider d05f24fe-4395-4079-99ef-1ac1245f55e5 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 95, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67131) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1107.453652] env[67131]: DEBUG nova.compute.resource_tracker [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67131) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1107.453817] env[67131]: DEBUG oslo_concurrency.lockutils [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1108.453703] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1109.215565] env[67131]: DEBUG oslo_service.periodic_task [None req-4069ad72-fa06-4547-8754-92aad801728f None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67131) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1123.676191] env[67131]: DEBUG nova.compute.manager [req-f7161479-0a95-4091-b5d2-695583a9be27 req-097218d9-6369-4f6e-a2b4-8e61605b83f1 service nova] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Received event network-vif-deleted-52989d3c-b2b5-4dff-a7cf-b28c6a9f0080 {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1126.694127] env[67131]: DEBUG nova.compute.manager [req-6dfc3f98-d48d-4995-9476-8bb2aaf091f3 req-6ea72722-56a9-4743-8d6a-c000bd07d13d service nova] [instance: 0a65b6d2-ee6f-4dc0-8981-2ddc00a1e912] Received event network-vif-deleted-df990c58-f99e-4287-8cf9-6700ac5252da {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1128.718545] env[67131]: DEBUG nova.compute.manager [req-e48b8d12-604c-4e6f-b1b4-c0e3acc33550 req-484ae03e-83e1-461d-9985-cb8860dc0897 service nova] [instance: 8f4044dd-2e1d-40cb-99f9-96c9584bfe5e] Received event network-vif-deleted-c4f32e5c-b0e1-4eb6-8dc5-7dd900494c8c {{(pid=67131) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1145.838692] env[67131]: WARNING oslo_vmware.rw_handles [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1145.838692] env[67131]: ERROR oslo_vmware.rw_handles
[ 1145.839524] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Downloaded image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to vmware_temp/597097db-4121-46a1-8dd3-e3a68c37fa72/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) fetch_image
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1145.840910] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Caching image {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1145.841181] env[67131]: DEBUG nova.virt.vmwareapi.vm_util [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Copying Virtual Disk [datastore1] vmware_temp/597097db-4121-46a1-8dd3-e3a68c37fa72/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk to [datastore1] vmware_temp/597097db-4121-46a1-8dd3-e3a68c37fa72/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk {{(pid=67131) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1145.841475] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8a9d8ff4-f9ed-4599-983c-55c0e08bb833 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.849550] env[67131]: DEBUG oslo_vmware.api [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Waiting for the task: (returnval){ [ 1145.849550] env[67131]: value = "task-3456519" [ 1145.849550] env[67131]: _type = "Task" [ 1145.849550] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1145.857293] env[67131]: DEBUG oslo_vmware.api [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Task: {'id': task-3456519, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1146.361752] env[67131]: DEBUG oslo_vmware.exceptions [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Fault InvalidArgument not matched. {{(pid=67131) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1146.362031] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1146.362549] env[67131]: ERROR nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1146.362549] env[67131]: Faults: ['InvalidArgument'] [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Traceback (most recent call last): [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in 
_build_resources [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] yield resources [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] self.driver.spawn(context, instance, image_meta, [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] self._fetch_image_if_missing(context, vi) [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] image_cache(vi, tmp_image_ds_loc) [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] vm_util.copy_virtual_disk( [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1146.362549] 
env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] session._wait_for_task(vmdk_copy_task) [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] return self.wait_for_task(task_ref) [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] return evt.wait() [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] result = hub.switch() [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] return self.greenlet.switch() [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] self.f(*self.args, **self.kw) [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1146.362549] env[67131]: ERROR nova.compute.manager 
[instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] raise exceptions.translate_fault(task_info.error) [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Faults: ['InvalidArgument'] [ 1146.362549] env[67131]: ERROR nova.compute.manager [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] [ 1146.363599] env[67131]: INFO nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Terminating instance [ 1146.364384] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1146.364596] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1146.364839] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9700c0d6-c714-4a95-92ca-a38f918eca26 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.366953] env[67131]: DEBUG nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 
tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1146.367158] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1146.367847] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35f6de29-6809-4ada-be2c-df9e1825303d {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.374606] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1146.374828] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c92e8a03-860e-4486-b8ef-542eb6825195 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.376831] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1146.377053] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 
tempest-ServersTestManualDisk-793186305-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1146.377969] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e47beeda-e30a-4d19-918d-58c3ab4a6303 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.382522] env[67131]: DEBUG oslo_vmware.api [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Waiting for the task: (returnval){ [ 1146.382522] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]521470c3-64bf-1998-bb23-9e1d3216e4e1" [ 1146.382522] env[67131]: _type = "Task" [ 1146.382522] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1146.389927] env[67131]: DEBUG oslo_vmware.api [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]521470c3-64bf-1998-bb23-9e1d3216e4e1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1146.449817] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1146.450033] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1146.450198] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Deleting the datastore file [datastore1] 765e5c4e-c893-41d2-9087-43294f24f5c3 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1146.450454] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f635074b-276d-4cd8-a8e6-ca94d4c5eefb {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.456723] env[67131]: DEBUG oslo_vmware.api [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Waiting for the task: (returnval){ [ 1146.456723] env[67131]: value = "task-3456521" [ 1146.456723] env[67131]: _type = "Task" [ 1146.456723] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1146.464184] env[67131]: DEBUG oslo_vmware.api [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Task: {'id': task-3456521, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1146.892969] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1146.893361] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Creating directory with path [datastore1] vmware_temp/125f4a5e-05b0-4ba3-9ab4-e7648feefd5e/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1146.893462] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6de606f3-cddc-4c46-b6a5-bee2b5d43a84 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.905335] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Created directory with path [datastore1] vmware_temp/125f4a5e-05b0-4ba3-9ab4-e7648feefd5e/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1146.905513] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c 
tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Fetch image to [datastore1] vmware_temp/125f4a5e-05b0-4ba3-9ab4-e7648feefd5e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1146.905683] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/125f4a5e-05b0-4ba3-9ab4-e7648feefd5e/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1146.906445] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f149178f-93f4-4ff5-8243-f08803e6b884 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.912611] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9b556a3-d36f-417e-993b-823f57198cff {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.921362] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b015e5b-2bc3-492b-b1a6-581d28056e02 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.951331] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2de32db4-818f-4eac-bb75-854d9b2664ce {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.956840] env[67131]: DEBUG 
oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-22e2b1bb-6152-4f89-a62f-e7ebc1783768 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.965492] env[67131]: DEBUG oslo_vmware.api [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Task: {'id': task-3456521, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081506} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1146.965717] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1146.965919] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1146.966120] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1146.966287] env[67131]: INFO nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Took 0.60 seconds to 
destroy the instance on the hypervisor. [ 1146.968308] env[67131]: DEBUG nova.compute.claims [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1146.968473] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1146.968673] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1146.983502] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1146.996756] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s 
{{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1146.997475] env[67131]: DEBUG nova.compute.utils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Instance 765e5c4e-c893-41d2-9087-43294f24f5c3 could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1146.998893] env[67131]: DEBUG nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Instance disappeared during build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1146.999069] env[67131]: DEBUG nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1146.999234] env[67131]: DEBUG nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1146.999384] env[67131]: DEBUG nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1146.999543] env[67131]: DEBUG nova.network.neutron [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1147.030773] env[67131]: DEBUG nova.network.neutron [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Updating instance_info_cache with network_info: [] {{(pid=67131) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1147.040343] env[67131]: INFO nova.compute.manager [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] [instance: 765e5c4e-c893-41d2-9087-43294f24f5c3] Took 0.04 seconds to deallocate network for instance. 
[ 1147.083892] env[67131]: DEBUG oslo_concurrency.lockutils [None req-cc0089a4-2b32-45c9-8574-fb57a6cce532 tempest-DeleteServersAdminTestJSON-1355762682 tempest-DeleteServersAdminTestJSON-1355762682-project-member] Lock "765e5c4e-c893-41d2-9087-43294f24f5c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 338.122s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1147.158634] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1147.159430] env[67131]: ERROR nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. 
[ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Traceback (most recent call last): [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] result = getattr(controller, method)(*args, **kwargs) [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] return self._get(image_id) [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] resp, body = self.http_client.get(url, headers=header) [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] return self.request(url, 'GET', **kwargs) [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] return self._handle_response(resp) [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] raise exc.from_response(resp, resp.content) [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] During handling of the above exception, another exception occurred: [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Traceback (most recent call last): [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] yield resources [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] self.driver.spawn(context, instance, image_meta, [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] self._fetch_image_if_missing(context, vi) [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in 
_fetch_image_if_missing [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] image_fetch(context, vi, tmp_image_ds_loc) [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] images.fetch_image( [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1147.159430] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] metadata = IMAGE_API.get(context, image_ref) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] return session.show(context, image_id, [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] _reraise_translated_image_exception(image_id) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] raise new_exc.with_traceback(exc_trace) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1147.160914] env[67131]: ERROR nova.compute.manager 
[instance: 2778d965-ad71-4239-b03a-214cd11b08ed] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] result = getattr(controller, method)(*args, **kwargs) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] return self._get(image_id) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] resp, body = self.http_client.get(url, headers=header) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] return self.request(url, 'GET', **kwargs) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request 
[ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] return self._handle_response(resp) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] raise exc.from_response(resp, resp.content) [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. [ 1147.160914] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] [ 1147.160914] env[67131]: INFO nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Terminating instance [ 1147.162210] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1147.162210] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1147.162210] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c 
tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1147.162386] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1147.162474] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-53e4af71-480e-449d-beb2-bbda4231af55 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.165430] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7baa7e1e-4851-4acd-9b0e-af917af68b43 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.172858] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1147.173121] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c1c26f4c-ef71-40a5-b9b7-2b3a5e70bdef {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.175583] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Created directory with path 
[datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1147.175752] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1147.176741] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1799fa19-1616-4407-bddb-24f22accf051 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.181569] env[67131]: DEBUG oslo_vmware.api [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Waiting for the task: (returnval){ [ 1147.181569] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5288e1ec-e278-96f3-0c66-6f8960019bd8" [ 1147.181569] env[67131]: _type = "Task" [ 1147.181569] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1147.193586] env[67131]: DEBUG oslo_vmware.api [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]5288e1ec-e278-96f3-0c66-6f8960019bd8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1147.232054] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1147.232290] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1147.232475] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Deleting the datastore file [datastore1] 2778d965-ad71-4239-b03a-214cd11b08ed {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1147.232759] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d7a911cf-9c62-4e25-b669-898f993cb3ec {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.238207] env[67131]: DEBUG oslo_vmware.api [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Waiting for the task: (returnval){ [ 1147.238207] env[67131]: value = "task-3456523" [ 1147.238207] env[67131]: _type = "Task" [ 1147.238207] env[67131]: } to complete. 
{{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1147.245424] env[67131]: DEBUG oslo_vmware.api [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Task: {'id': task-3456523, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1147.690881] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1147.691141] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Creating directory with path [datastore1] vmware_temp/fe7ac151-7a6f-4d2d-9384-cba28f33243c/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1147.691365] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c5c30f09-aa01-4381-a294-0f924f8b7202 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.703387] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Created directory with path [datastore1] vmware_temp/fe7ac151-7a6f-4d2d-9384-cba28f33243c/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1147.703603] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None 
req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Fetch image to [datastore1] vmware_temp/fe7ac151-7a6f-4d2d-9384-cba28f33243c/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1147.703854] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/fe7ac151-7a6f-4d2d-9384-cba28f33243c/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1147.704641] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbcf48a2-a9ec-4af2-aae2-6c560a70bc91 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.711112] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a000bd45-4a25-4a28-aed7-8a69be27a779 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.719799] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e95f41b6-e18b-4357-8e99-511c82b2f366 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.753496] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6f64d90-7090-4d78-8d0d-73dbb433818f {{(pid=67131) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.760450] env[67131]: DEBUG oslo_vmware.api [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Task: {'id': task-3456523, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.142011} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1147.761746] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1147.761930] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1147.762109] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1147.762282] env[67131]: INFO nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1147.763982] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bc513e3d-4b7c-4fe6-b841-4e9a0b758029 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.765795] env[67131]: DEBUG nova.compute.claims [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1147.765989] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1147.766219] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1147.788720] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1147.791549] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c 
tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1147.792169] env[67131]: DEBUG nova.compute.utils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Instance 2778d965-ad71-4239-b03a-214cd11b08ed could not be found. {{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1147.793530] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Instance disappeared during build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1147.793695] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1147.793851] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1147.794029] env[67131]: DEBUG nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1147.794191] env[67131]: DEBUG nova.network.neutron [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1147.878648] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1147.879464] env[67131]: ERROR nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511. 
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Traceback (most recent call last): [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] result = getattr(controller, method)(*args, **kwargs) [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return self._get(image_id) [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] resp, body = self.http_client.get(url, headers=header) [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return self.request(url, 'GET', **kwargs) [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return self._handle_response(resp) [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] raise exc.from_response(resp, resp.content) [ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] During handling of the above exception, another exception occurred:
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Traceback (most recent call last):
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     yield resources
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self.driver.spawn(context, instance, image_meta,
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self._fetch_image_if_missing(context, vi)
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     images.fetch_image(
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1147.879464] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     metadata = IMAGE_API.get(context, image_ref)
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return session.show(context, image_id,
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     _reraise_translated_image_exception(image_id)
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     raise new_exc.with_traceback(exc_trace)
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     result = getattr(controller, method)(*args, **kwargs)
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return self._get(image_id)
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     resp, body = self.http_client.get(url, headers=header)
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return self.request(url, 'GET', **kwargs)
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return self._handle_response(resp)
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     raise exc.from_response(resp, resp.content)
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511.
[ 1147.880493] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1147.880493] env[67131]: INFO nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Terminating instance
[ 1147.881218] env[67131]: DEBUG oslo_concurrency.lockutils [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f3f0d8a-6139-4366-9e0b-060c24ad2511/6f3f0d8a-6139-4366-9e0b-060c24ad2511.vmdk" {{(pid=67131) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1147.881425] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1147.882037] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Start destroying the instance on the hypervisor. {{(pid=67131) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1147.882228] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Destroying instance {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1147.882468] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0bf07bde-8a0c-4c9d-a6b0-b65750918040 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.884934] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfcd5bfd-41dd-4bb2-9a01-d88db00f3f49 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.892186] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Unregistering the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1147.892405] env[67131]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-89577aa9-43c9-4571-a441-c2d8bd1c97b1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.894549] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1147.894774] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67131) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1147.895686] env[67131]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-08808c55-f7b2-43dc-b132-322d72e9b6d2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.900947] env[67131]: DEBUG oslo_vmware.api [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Waiting for the task: (returnval){
[ 1147.900947] env[67131]: value = "session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]522d108b-26cf-64fb-bb40-a06cf21b9933"
[ 1147.900947] env[67131]: _type = "Task"
[ 1147.900947] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1147.903591] env[67131]: DEBUG neutronclient.v2_0.client [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67131) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1147.905093] env[67131]: ERROR nova.compute.manager [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Traceback (most recent call last):
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     result = getattr(controller, method)(*args, **kwargs)
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self._get(image_id)
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     resp, body = self.http_client.get(url, headers=header)
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self.request(url, 'GET', **kwargs)
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self._handle_response(resp)
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     raise exc.from_response(resp, resp.content)
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] During handling of the above exception, another exception occurred:
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Traceback (most recent call last):
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self.driver.spawn(context, instance, image_meta,
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self._fetch_image_if_missing(context, vi)
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     images.fetch_image(
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     metadata = IMAGE_API.get(context, image_ref)
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return session.show(context, image_id,
[ 1147.905093] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     _reraise_translated_image_exception(image_id)
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     raise new_exc.with_traceback(exc_trace)
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     result = getattr(controller, method)(*args, **kwargs)
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self._get(image_id)
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     resp, body = self.http_client.get(url, headers=header)
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self.request(url, 'GET', **kwargs)
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self._handle_response(resp)
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     raise exc.from_response(resp, resp.content)
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511.
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] During handling of the above exception, another exception occurred:
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Traceback (most recent call last):
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self._build_and_run_instance(context, instance, image,
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     with excutils.save_and_reraise_exception():
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self.force_reraise()
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     raise self.value
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     with self.rt.instance_claim(context, instance, node, allocs,
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self.abort()
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1147.906181] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return f(*args, **kwargs)
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self._unset_instance_host_and_node(instance)
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     instance.save()
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     updates, result = self.indirection_api.object_action(
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return cctxt.call(context, 'object_action', objinst=objinst,
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     result = self.transport._send(
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self._driver.send(target, ctxt, message,
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     raise result
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] nova.exception_Remote.InstanceNotFound_Remote: Instance 2778d965-ad71-4239-b03a-214cd11b08ed could not be found.
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Traceback (most recent call last):
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return getattr(target, method)(*args, **kwargs)
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return fn(self, *args, **kwargs)
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     old_ref, inst_ref = db.instance_update_and_get_original(
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return f(*args, **kwargs)
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     with excutils.save_and_reraise_exception() as ectxt:
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self.force_reraise()
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     raise self.value
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return f(*args, **kwargs)
[ 1147.907468] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return f(context, *args, **kwargs)
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     raise exception.InstanceNotFound(instance_id=uuid)
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] nova.exception.InstanceNotFound: Instance 2778d965-ad71-4239-b03a-214cd11b08ed could not be found.
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] During handling of the above exception, another exception occurred:
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Traceback (most recent call last):
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     ret = obj(*args, **kwargs)
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     exception_handler_v20(status_code, error_body)
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     raise client_exc(message=error_message,
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Neutron server returns request_ids: ['req-de5ea01d-8a9f-4f1b-a3eb-d8a80c8a4966']
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] During handling of the above exception, another exception occurred:
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] Traceback (most recent call last):
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self._deallocate_network(context, instance, requested_networks)
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self.network_api.deallocate_for_instance(
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     data = neutron.list_ports(**search_opts)
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     ret = obj(*args, **kwargs)
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self.list('ports', self.ports_path, retrieve_all,
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     ret = obj(*args, **kwargs)
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1147.908798] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     for r in self._pagination(collection, path, **params):
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     res = self.get(path, params=params)
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     ret = obj(*args, **kwargs)
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self.retry_request("GET", action, body=body,
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     ret = obj(*args, **kwargs)
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     return self.do_request(method, action, body=body,
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     ret = obj(*args, **kwargs)
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     self._handle_fault_response(status_code, replybody, resp)
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed]     raise exception.Unauthorized()
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] nova.exception.Unauthorized: Not authorized.
[ 1147.909992] env[67131]: ERROR nova.compute.manager [instance: 2778d965-ad71-4239-b03a-214cd11b08ed] [ 1147.910875] env[67131]: DEBUG oslo_vmware.api [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Task: {'id': session[524f7e81-b62e-04c7-b6df-33e9d86bc9e5]522d108b-26cf-64fb-bb40-a06cf21b9933, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1147.925668] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ea1309a9-8a7a-4e06-8425-87d2c96a061c tempest-ServersTestManualDisk-793186305 tempest-ServersTestManualDisk-793186305-project-member] Lock "2778d965-ad71-4239-b03a-214cd11b08ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 332.353s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.114574] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Unregistered the VM {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1148.114790] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Deleting contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1148.115032] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 
tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Deleting the datastore file [datastore1] 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585 {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1148.115300] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b104882c-261b-4ef6-9798-7ee33643f389 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.122496] env[67131]: DEBUG oslo_vmware.api [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Waiting for the task: (returnval){ [ 1148.122496] env[67131]: value = "task-3456525" [ 1148.122496] env[67131]: _type = "Task" [ 1148.122496] env[67131]: } to complete. {{(pid=67131) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1148.129656] env[67131]: DEBUG oslo_vmware.api [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Task: {'id': task-3456525, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1148.410825] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Preparing fetch location {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1148.411032] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Creating directory with path [datastore1] vmware_temp/6e04f570-5029-430a-b051-98c849089661/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1148.411259] env[67131]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-824b22e5-898e-4ca6-8e07-6ac078631ef4 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.422181] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Created directory with path [datastore1] vmware_temp/6e04f570-5029-430a-b051-98c849089661/6f3f0d8a-6139-4366-9e0b-060c24ad2511 {{(pid=67131) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1148.422356] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Fetch image to [datastore1] vmware_temp/6e04f570-5029-430a-b051-98c849089661/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk {{(pid=67131) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 
1148.422524] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to [datastore1] vmware_temp/6e04f570-5029-430a-b051-98c849089661/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk on the data store datastore1 {{(pid=67131) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1148.423199] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3dea668-8400-4292-9fba-e3d018c515f2 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.429647] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56175664-4632-47dd-aa7c-8b901b76e929 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.438505] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa61bef0-9716-4c39-89d5-834a901639e1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.469088] env[67131]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d586969-6373-4e02-bf9d-6c4c0f34ffca {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.474146] env[67131]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-da2f8337-066c-4e9b-ad50-283d5a16a7c1 {{(pid=67131) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.493794] env[67131]: DEBUG nova.virt.vmwareapi.images [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e 
tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] [instance: 2fd6ec26-9e42-43fd-a09c-de43a8107aee] Downloading image file data 6f3f0d8a-6139-4366-9e0b-060c24ad2511 to the data store datastore1 {{(pid=67131) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1148.537635] env[67131]: DEBUG oslo_vmware.rw_handles [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6e04f570-5029-430a-b051-98c849089661/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67131) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1148.594926] env[67131]: DEBUG oslo_vmware.rw_handles [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Completed reading data from the image iterator. {{(pid=67131) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1148.595134] env[67131]: DEBUG oslo_vmware.rw_handles [None req-208c35c2-2549-4a50-99f9-b1ff9f2cbe1e tempest-DeleteServersTestJSON-1481289198 tempest-DeleteServersTestJSON-1481289198-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6e04f570-5029-430a-b051-98c849089661/6f3f0d8a-6139-4366-9e0b-060c24ad2511/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67131) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1148.631958] env[67131]: DEBUG oslo_vmware.api [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Task: {'id': task-3456525, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071708} completed successfully. {{(pid=67131) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1148.632230] env[67131]: DEBUG nova.virt.vmwareapi.ds_util [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Deleted the datastore file {{(pid=67131) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1148.632412] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Deleted contents of the VM from datastore datastore1 {{(pid=67131) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1148.632580] env[67131]: DEBUG nova.virt.vmwareapi.vmops [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Instance destroyed {{(pid=67131) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1148.632758] env[67131]: INFO nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Took 0.75 seconds to destroy the instance on the hypervisor. 
[ 1148.634801] env[67131]: DEBUG nova.compute.claims [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Aborting claim: {{(pid=67131) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1148.634983] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1148.635214] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1148.657906] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.023s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.658542] env[67131]: DEBUG nova.compute.utils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Instance 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585 could not be found. 
{{(pid=67131) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1148.659797] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Instance disappeared during build. {{(pid=67131) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1148.659961] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Unplugging VIFs for instance {{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1148.660134] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67131) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1148.660297] env[67131]: DEBUG nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Deallocating network for instance {{(pid=67131) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1148.660456] env[67131]: DEBUG nova.network.neutron [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] deallocate_for_instance() {{(pid=67131) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1148.746661] env[67131]: DEBUG neutronclient.v2_0.client [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67131) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1148.748176] env[67131]: ERROR nova.compute.manager [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Traceback (most recent call last): [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] result = getattr(controller, method)(*args, **kwargs) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return self._get(image_id) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] resp, body = self.http_client.get(url, headers=header) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", 
line 395, in get [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return self.request(url, 'GET', **kwargs) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return self._handle_response(resp) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] raise exc.from_response(resp, resp.content) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] During handling of the above exception, another exception occurred: [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Traceback (most recent call last): [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] self.driver.spawn(context, instance, image_meta, [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] self._fetch_image_if_missing(context, vi) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] image_fetch(context, vi, tmp_image_ds_loc) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] images.fetch_image( [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] metadata = IMAGE_API.get(context, image_ref) [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return session.show(context, image_id, [ 1148.748176] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] _reraise_translated_image_exception(image_id) [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] raise new_exc.with_traceback(exc_trace) [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 
1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] result = getattr(controller, method)(*args, **kwargs) [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return self._get(image_id) [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] resp, body = self.http_client.get(url, headers=header) [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return self.request(url, 'GET', **kwargs) [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] return self._handle_response(resp) [ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     raise exc.from_response(resp, resp.content)
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] nova.exception.ImageNotAuthorized: Not authorized for image 6f3f0d8a-6139-4366-9e0b-060c24ad2511.
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] During handling of the above exception, another exception occurred:
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Traceback (most recent call last):
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self._build_and_run_instance(context, instance, image,
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     with excutils.save_and_reraise_exception():
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self.force_reraise()
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     raise self.value
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     with self.rt.instance_claim(context, instance, node, allocs,
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self.abort()
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1148.749417] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return f(*args, **kwargs)
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self._unset_instance_host_and_node(instance)
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     instance.save()
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     updates, result = self.indirection_api.object_action(
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return cctxt.call(context, 'object_action', objinst=objinst,
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     result = self.transport._send(
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return self._driver.send(target, ctxt, message,
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     raise result
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] nova.exception_Remote.InstanceNotFound_Remote: Instance 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585 could not be found.
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Traceback (most recent call last):
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return getattr(target, method)(*args, **kwargs)
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return fn(self, *args, **kwargs)
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     old_ref, inst_ref = db.instance_update_and_get_original(
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return f(*args, **kwargs)
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     with excutils.save_and_reraise_exception() as ectxt:
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self.force_reraise()
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     raise self.value
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return f(*args, **kwargs)
[ 1148.750531] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return f(context, *args, **kwargs)
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     raise exception.InstanceNotFound(instance_id=uuid)
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] nova.exception.InstanceNotFound: Instance 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585 could not be found.
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] During handling of the above exception, another exception occurred:
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Traceback (most recent call last):
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     ret = obj(*args, **kwargs)
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     exception_handler_v20(status_code, error_body)
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     raise client_exc(message=error_message,
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Neutron server returns request_ids: ['req-da405561-33a0-4a13-8eb2-79510c6d39cf']
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] During handling of the above exception, another exception occurred:
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] Traceback (most recent call last):
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self._deallocate_network(context, instance, requested_networks)
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self.network_api.deallocate_for_instance(
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     data = neutron.list_ports(**search_opts)
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     ret = obj(*args, **kwargs)
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return self.list('ports', self.ports_path, retrieve_all,
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     ret = obj(*args, **kwargs)
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1148.751765] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     for r in self._pagination(collection, path, **params):
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     res = self.get(path, params=params)
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     ret = obj(*args, **kwargs)
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return self.retry_request("GET", action, body=body,
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     ret = obj(*args, **kwargs)
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     return self.do_request(method, action, body=body,
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     ret = obj(*args, **kwargs)
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     self._handle_fault_response(status_code, replybody, resp)
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]     raise exception.Unauthorized()
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585] nova.exception.Unauthorized: Not authorized.
[ 1148.753041] env[67131]: ERROR nova.compute.manager [instance: 52c4c672-9e1a-42b9-9cf8-ca6de5a6b585]
[ 1148.768176] env[67131]: DEBUG oslo_concurrency.lockutils [None req-ff1e4a68-9440-4c5b-a821-4732b0e91633 tempest-ServerMetadataNegativeTestJSON-305136459 tempest-ServerMetadataNegativeTestJSON-305136459-project-member] Lock "52c4c672-9e1a-42b9-9cf8-ca6de5a6b585" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 330.759s {{(pid=67131) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}