[ 500.580884] env[60125]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 501.230305] env[60175]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 502.811090] env[60175]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=60175) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 502.811523] env[60175]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=60175) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 502.811523] env[60175]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=60175) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 502.811825] env[60175]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 502.812991] env[60175]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 502.929425] env[60175]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=60175) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 502.940016] env[60175]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=60175) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 503.041274] env[60175]: INFO nova.virt.driver [None req-cf6493fa-b019-47e8-bca5-c694baa72678 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 503.114227] env[60175]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 503.114424] env[60175]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 503.114491] env[60175]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=60175) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 506.323888] env[60175]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-5e5ca643-837e-4211-a43d-8d262b5c8fe3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 506.340205] env[60175]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=60175) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 506.340360] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-5b64dc40-af47-4a33-9437-70fd3c0290ac {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 506.366370] env[60175]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 67655.
[ 506.366500] env[60175]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.252s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 506.367181] env[60175]: INFO nova.virt.vmwareapi.driver [None req-cf6493fa-b019-47e8-bca5-c694baa72678 None None] VMware vCenter version: 7.0.3
[ 506.370562] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-423f285b-bf8b-43ee-bcff-cdf0af8c2b73 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 506.387440] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ace2562e-1a84-4b7e-a68c-b3e206251758 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 506.393099] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d81db395-b411-4d18-83c2-86d8f400708d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 506.399462] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b79b4953-71c0-4ed9-8405-315416c2c1cc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 506.412985] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e11b739f-3d5e-4ae2-b5f2-7ef9c1248c8b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 506.418753] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c499e209-634d-41f6-a199-23d607b0b0ae {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 506.448591] env[60175]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-fcbd475b-dbc2-4474-8b57-39ac7929b4f6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 506.453607] env[60175]: DEBUG nova.virt.vmwareapi.driver [None req-cf6493fa-b019-47e8-bca5-c694baa72678 None None] Extension org.openstack.compute already exists. {{(pid=60175) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 506.456273] env[60175]: INFO nova.compute.provider_config [None req-cf6493fa-b019-47e8-bca5-c694baa72678 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 506.474379] env[60175]: DEBUG nova.context [None req-cf6493fa-b019-47e8-bca5-c694baa72678 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),e3885d83-7df6-4250-a13b-6a1c0495dd3b(cell1) {{(pid=60175) load_cells /opt/stack/nova/nova/context.py:464}}
[ 506.476279] env[60175]: DEBUG oslo_concurrency.lockutils [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 506.476497] env[60175]: DEBUG oslo_concurrency.lockutils [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 506.477295] env[60175]: DEBUG oslo_concurrency.lockutils [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 506.477636] env[60175]: DEBUG oslo_concurrency.lockutils [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Acquiring lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 506.477831] env[60175]: DEBUG oslo_concurrency.lockutils [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 506.478794] env[60175]: DEBUG oslo_concurrency.lockutils [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Lock "e3885d83-7df6-4250-a13b-6a1c0495dd3b" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 506.491151] env[60175]: DEBUG oslo_db.sqlalchemy.engines [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60175) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 506.491538] env[60175]: DEBUG oslo_db.sqlalchemy.engines [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60175) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 506.498917] env[60175]: ERROR nova.db.main.api [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 506.498917] env[60175]: result = function(*args, **kwargs)
[ 506.498917] env[60175]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 506.498917] env[60175]: return func(*args, **kwargs)
[ 506.498917] env[60175]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 506.498917] env[60175]: result = fn(*args, **kwargs)
[ 506.498917] env[60175]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 506.498917] env[60175]: return f(*args, **kwargs)
[ 506.498917] env[60175]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 506.498917] env[60175]: return db.service_get_minimum_version(context, binaries)
[ 506.498917] env[60175]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 506.498917] env[60175]: _check_db_access()
[ 506.498917] env[60175]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 506.498917] env[60175]: stacktrace = ''.join(traceback.format_stack())
[ 506.498917] env[60175]:
[ 506.499708] env[60175]: ERROR nova.db.main.api [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 506.499708] env[60175]: result = function(*args, **kwargs)
[ 506.499708] env[60175]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 506.499708] env[60175]: return func(*args, **kwargs)
[ 506.499708] env[60175]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 506.499708] env[60175]: result = fn(*args, **kwargs)
[ 506.499708] env[60175]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 506.499708] env[60175]: return f(*args, **kwargs)
[ 506.499708] env[60175]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 506.499708] env[60175]: return db.service_get_minimum_version(context, binaries)
[ 506.499708] env[60175]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 506.499708] env[60175]: _check_db_access()
[ 506.499708] env[60175]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 506.499708] env[60175]: stacktrace = ''.join(traceback.format_stack())
[ 506.499708] env[60175]:
[ 506.500056] env[60175]: WARNING nova.objects.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 506.500214] env[60175]: WARNING nova.objects.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Failed to get minimum service version for cell e3885d83-7df6-4250-a13b-6a1c0495dd3b
[ 506.500630] env[60175]: DEBUG oslo_concurrency.lockutils [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Acquiring lock "singleton_lock" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 506.500789] env[60175]: DEBUG oslo_concurrency.lockutils [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Acquired lock "singleton_lock" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 506.501040] env[60175]: DEBUG oslo_concurrency.lockutils [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Releasing lock "singleton_lock" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 506.501378] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Full set of CONF: {{(pid=60175) _wait_for_exit_or_signal
/usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} [ 506.501521] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ******************************************************************************** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 506.501647] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] Configuration options gathered from: {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 506.501785] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 506.501971] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 506.502152] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ================================================================================ {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 506.502375] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] allow_resize_to_same_host = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.502543] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] arq_binding_timeout = 300 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.502673] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] backdoor_port = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.502797] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] backdoor_socket = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.502961] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] block_device_allocate_retries = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.503138] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] block_device_allocate_retries_interval = 3 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.503304] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cert = self.pem {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.503469] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.503635] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] 
compute_monitors = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.503838] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] config_dir = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.503966] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] config_drive_format = iso9660 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.504114] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.504279] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] config_source = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.504446] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] console_host = devstack {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.504610] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] control_exchange = nova {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.504774] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cpu_allocation_ratio = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.504926] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] daemon = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.505126] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] debug = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.505302] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] default_access_ip_network_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.505467] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] default_availability_zone = nova {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.505622] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] default_ephemeral_format = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.505858] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.506033] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] default_schedule_zone = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.506197] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] disk_allocation_ratio = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.506366] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] enable_new_services = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.506536] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] enabled_apis = ['osapi_compute'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.506696] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] enabled_ssl_apis = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.506875] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] flat_injected = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.507055] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] force_config_drive = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.507220] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] force_raw_images = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.507387] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] graceful_shutdown_timeout = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.507545] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] heal_instance_info_cache_interval = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.507756] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] host = cpu-1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.507950] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] initial_cpu_allocation_ratio = 4.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.508131] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] initial_disk_allocation_ratio = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.508303] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] initial_ram_allocation_ratio = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.508520] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.508688] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] instance_build_timeout = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.508857] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] instance_delete_interval = 300 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.509036] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] instance_format = [instance: %(uuid)s] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.509215] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] instance_name_template = instance-%08x {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.509375] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] instance_usage_audit = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.509544] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] instance_usage_audit_period = month {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.509703] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.509867] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] instances_path = /opt/stack/data/nova/instances {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.510046] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] internal_service_availability_zone = internal {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.510208] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] key = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.510371] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] live_migration_retry_count = 30 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.510534] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] log_config_append = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.510700] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.510876] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] log_dir = None {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.511061] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] log_file = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.511195] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] log_options = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.511357] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] log_rotate_interval = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.511525] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] log_rotate_interval_type = days {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.511688] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] log_rotation_type = none {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.511817] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.511970] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.512113] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.512293] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.512429] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.512592] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] long_rpc_timeout = 1800 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.512752] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] max_concurrent_builds = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.512908] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] max_concurrent_live_migrations = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.513077] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] max_concurrent_snapshots = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.513238] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] max_local_block_devices = 3 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.513396] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] max_logfile_count = 30 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.513552] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] max_logfile_size_mb = 200 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.513712] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] maximum_instance_delete_attempts = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.513939] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] metadata_listen = 0.0.0.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.514125] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] metadata_listen_port = 8775 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.514309] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] metadata_workers = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.514474] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] migrate_max_retries = -1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.514641] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] mkisofs_cmd = genisoimage {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.514847] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] my_block_storage_ip = 10.180.1.21 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.514982] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] my_ip = 10.180.1.21 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.515193] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] network_allocate_retries = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.515335] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.515503] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] osapi_compute_listen = 0.0.0.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.515667] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] osapi_compute_listen_port = 8774 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.515836] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] osapi_compute_unique_server_name_scope = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.516013] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] osapi_compute_workers = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.516182] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] password_length = 12 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.516342] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] periodic_enable = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.516503] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] periodic_fuzzy_delay = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.516671] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] pointer_model = usbtablet {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.516891] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] preallocate_images = none {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.517073] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] publish_errors = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.517214] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] pybasedir = /opt/stack/nova {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.517376] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ram_allocation_ratio = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.517540] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] rate_limit_burst = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.517708] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] rate_limit_except_level = CRITICAL {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.517870] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] rate_limit_interval = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.518070] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] reboot_timeout = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.518295] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] 
reclaim_instance_interval = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.518463] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] record = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.518637] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] reimage_timeout_per_gb = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.518804] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] report_interval = 120 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.518966] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] rescue_timeout = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.519158] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] reserved_host_cpus = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.519319] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] reserved_host_disk_mb = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.519478] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] reserved_host_memory_mb = 512 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.519634] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] reserved_huge_pages = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.519795] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] resize_confirm_window = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.519960] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] resize_fs_using_block_device = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.520154] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] resume_guests_state_on_host_boot = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.520327] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.520489] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] rpc_response_timeout = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.520647] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] run_external_periodic_tasks = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.520814] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] running_deleted_instance_action = reap 
{{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.520974] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] running_deleted_instance_poll_interval = 1800 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.521148] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] running_deleted_instance_timeout = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.521336] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler_instance_sync_interval = 120 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.521474] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] service_down_time = 300 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.521647] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] servicegroup_driver = db {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.521808] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] shelved_offload_time = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.521970] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] shelved_poll_interval = 3600 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.522177] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] shutdown_timeout = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.522317] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] source_is_ipv6 = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.522475] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ssl_only = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.522722] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.522892] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] sync_power_state_interval = 600 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.523096] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] sync_power_state_pool_size = 1000 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.523282] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] syslog_log_facility = LOG_USER {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.523442] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] tempdir = None {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.523603] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] timeout_nbd = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.523772] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] transport_url = **** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.523935] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] update_resources_interval = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.524111] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] use_cow_images = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.524273] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] use_eventlog = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.524431] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] use_journal = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.524588] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] use_json = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.524745] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] use_rootwrap_daemon = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.524904] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] use_stderr = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.525075] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] use_syslog = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.525233] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vcpu_pin_set = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.525401] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plugging_is_fatal = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.525565] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plugging_timeout = 300 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.525728] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] virt_mkfs = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.525891] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] volume_usage_poll_interval = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.526087] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] watch_log_file = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.526275] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] web = /usr/share/spice-html5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 506.526463] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_concurrency.disable_process_locking = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.526763] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.526977] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.527191] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.527383] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_metrics.metrics_process_name = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.527556] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.527721] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.527905] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.auth_strategy = keystone {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.528121] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.compute_link_prefix = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.528307] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.528481] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.dhcp_domain = novalocal {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.528650] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.enable_instance_password = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.528812] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.glance_link_prefix = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.528971] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.529199] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.instance_list_cells_batch_strategy = distributed {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.529383] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.instance_list_per_project_cells = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.529549] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.list_records_by_skipping_down_cells = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.529709] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.local_metadata_per_cell = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.529878] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.max_limit = 1000 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.530058] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.metadata_cache_expiration = 15 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.530238] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.neutron_default_tenant_id = default {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.530402] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.use_forwarded_for = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.530565] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.use_neutron_default_nets = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.530732] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.530894] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.vendordata_dynamic_failure_fatal = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.531072] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.531249] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.vendordata_dynamic_ssl_certfile = {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.531418] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.vendordata_dynamic_targets = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.531582] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.vendordata_jsonfile_path = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.531762] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api.vendordata_providers = ['StaticJSON'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.531955] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.backend = dogpile.cache.memcached {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.532171] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.backend_argument = **** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.532339] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.config_prefix = cache.oslo {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.532504] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.dead_timeout = 60.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.532669] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.debug_cache_backend = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.532831] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.enable_retry_client = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.532992] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.enable_socket_keepalive = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.533178] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.enabled = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.533412] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.expiration_time = 600 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.533597] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.hashclient_retry_attempts = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.533765] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.hashclient_retry_delay = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.533932] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] 
cache.memcache_dead_retry = 300 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.534141] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.memcache_password = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.534279] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.534444] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.534607] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.memcache_pool_maxsize = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.534767] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.memcache_pool_unused_timeout = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.534954] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.memcache_sasl_enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.535181] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.memcache_servers = ['localhost:11211'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.535421] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.memcache_socket_timeout = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.535524] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.memcache_username = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.535688] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.proxies = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.535849] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.retry_attempts = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.536015] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.retry_delay = 0.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.536182] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.socket_keepalive_count = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.536340] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.socket_keepalive_idle = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.536497] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.socket_keepalive_interval = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.536652] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.tls_allowed_ciphers = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.536806] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.tls_cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.536996] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.tls_certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.537189] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.tls_enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.537347] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cache.tls_keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.537512] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.auth_section = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.537685] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.auth_type = password {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.537872] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.538083] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.catalog_info = volumev3::publicURL {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.538250] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.538414] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.538574] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.cross_az_attach = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.538735] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.debug = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.538896] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.endpoint_template = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.539069] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b 
None None] cinder.http_retries = 3 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.539247] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.539406] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.539577] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.os_region_name = RegionOne {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.539741] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.539902] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cinder.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.540084] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.540249] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.cpu_dedicated_set = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.540415] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.cpu_shared_set = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.540581] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.image_type_exclude_list = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.540743] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.live_migration_wait_for_vif_plug = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.540925] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.max_concurrent_disk_ops = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.541147] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.max_disk_devices_to_attach = -1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.541320] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.541492] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
506.541656] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.resource_provider_association_refresh = 300 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.541820] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.shutdown_retry_interval = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.542018] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.542198] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] conductor.workers = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.542380] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] console.allowed_origins = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.542531] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] console.ssl_ciphers = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.542702] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] console.ssl_minimum_version = default {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.542873] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] consoleauth.token_ttl = 600 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.543050] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.543225] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.543407] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.543568] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.connect_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.543726] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.connect_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.543898] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.endpoint_override = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.544093] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.insecure = False {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.544260] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.544421] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.max_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.544577] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.min_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.544733] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.region_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.544889] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.service_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.545071] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.service_type = accelerator {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.545239] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.545398] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.status_code_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.545555] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.status_code_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.545710] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.545890] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.546064] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] cyborg.version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.546254] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.backend = sqlalchemy {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.546448] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.connection = **** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.546620] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.connection_debug = 0 {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.546790] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.connection_parameters = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.547009] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.connection_recycle_time = 3600 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.547203] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.connection_trace = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.547368] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.db_inc_retry_interval = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.547535] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.db_max_retries = 20 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.547697] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.db_max_retry_interval = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.547862] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.db_retry_interval = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.548046] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.max_overflow = 50 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.548214] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.max_pool_size = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.548381] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.max_retries = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.548544] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.mysql_enable_ndb = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.548716] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.mysql_sql_mode = TRADITIONAL {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.548877] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.mysql_wsrep_sync_wait = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.549051] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.pool_timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.549266] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.retry_interval = 10 
{{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.549443] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.slave_connection = **** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.549611] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.sqlite_synchronous = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.549774] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] database.use_db_reconnect = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.549979] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.backend = sqlalchemy {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.550196] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.connection = **** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.550371] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.connection_debug = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.550543] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.connection_parameters = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.550706] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.connection_recycle_time = 3600 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.550873] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.connection_trace = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.551046] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.db_inc_retry_interval = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.551217] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.db_max_retries = 20 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.551379] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.db_max_retry_interval = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.551541] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.db_retry_interval = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.551708] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.max_overflow = 50 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.551869] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.max_pool_size = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.552045] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.max_retries = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.552212] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.mysql_enable_ndb = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.552382] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.552540] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.mysql_wsrep_sync_wait = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.552698] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.pool_timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.554424] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.retry_interval = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.554621] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.slave_connection = **** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.554805] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] api_database.sqlite_synchronous = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.554992] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] devices.enabled_mdev_types = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.555197] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.555400] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ephemeral_storage_encryption.enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.555583] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ephemeral_storage_encryption.key_size = 512 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.555753] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.api_servers = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.555922] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.cafile = None {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.556123] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.556304] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.556471] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.connect_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.556630] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.connect_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.556796] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.debug = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.556996] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.default_trusted_certificate_ids = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.557189] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.enable_certificate_validation = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.557357] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.enable_rbd_download = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.557520] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.endpoint_override = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.557690] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.557874] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.558060] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.max_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.558226] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.min_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.558392] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.num_retries = 3 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.558564] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.rbd_ceph_conf = {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.558727] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.rbd_connect_timeout = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.558898] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.rbd_pool = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.559109] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.rbd_user = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.559450] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.region_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.559627] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.service_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.559803] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.service_type = image {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.559969] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.560145] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.status_code_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.560305] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.status_code_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.560464] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.560645] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.560809] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.verify_glance_signatures = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.560968] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] glance.version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.561149] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] guestfs.debug = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.561323] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.config_drive_cdrom = False {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.561487] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.config_drive_inject_password = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.561651] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.561812] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.enable_instance_metrics_collection = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.561974] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.enable_remotefx = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.562182] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.instances_path_share = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.562360] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.iscsi_initiator_list = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.562527] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.limit_cpu_features = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.562690] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.562851] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.563030] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.power_state_check_timeframe = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.563196] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.power_state_event_polling_interval = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.563367] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.563531] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.use_multipath_io = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.563692] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.volume_attach_retry_count = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.563851] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.volume_attach_retry_interval = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.564015] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.vswitch_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.564183] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.564348] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] mks.enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.564709] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.564932] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] image_cache.manager_interval = 2400 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.565149] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] image_cache.precache_concurrency = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.565340] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] image_cache.remove_unused_base_images = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.565514] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.565682] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.565860] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] image_cache.subdirectory_name = _base {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.566047] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.api_max_retries = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.566220] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.api_retry_interval = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.566381] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.auth_section = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.566544] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.auth_type = None {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.566704] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.566892] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.567078] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.567249] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.connect_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.567445] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.connect_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.567612] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.endpoint_override = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.567776] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.567967] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.568148] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.max_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.568309] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.min_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.568470] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.partition_key = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.568636] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.peer_list = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.568795] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.region_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.568957] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.serial_console_state_timeout = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.569132] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.service_name = None {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.569300] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.service_type = baremetal {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.569463] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.569624] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.status_code_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.569905] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.status_code_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.570103] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.570303] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.571584] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ironic.version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.571584] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.571584] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] key_manager.fixed_key = **** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.571584] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.571584] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.barbican_api_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.571584] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.barbican_endpoint = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.571753] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.barbican_endpoint_type = public {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.571753] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.barbican_region_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.571891] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.572094] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.572281] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.572451] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.572613] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.572779] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.number_of_retries = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.572940] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.retry_delay = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.573121] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.send_service_user_token = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.573289] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.573447] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.573607] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.verify_ssl = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.573767] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican.verify_ssl_path = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.573963] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican_service_user.auth_section = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.574117] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican_service_user.auth_type = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.574275] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican_service_user.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.574429] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican_service_user.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.574592] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican_service_user.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.574756] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican_service_user.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.574938] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican_service_user.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.575129] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican_service_user.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.575293] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] barbican_service_user.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.575460] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.approle_role_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.575620] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.approle_secret_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.575779] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.575937] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.576110] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.576275] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.576432] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.576602] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.kv_mountpoint = secret {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.576764] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.kv_version = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.576977] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.namespace = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.577175] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.root_token_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.577340] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.577499] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.ssl_ca_crt_file = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.577656] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.577819] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.use_ssl = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.578023] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.578203] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.578365] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.578530] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.578689] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.connect_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.578849] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.connect_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.579014] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.endpoint_override = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.579184] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.579362] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.579534] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b 
None None] keystone.max_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.579688] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.min_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.579844] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.region_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.579998] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.service_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.580188] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.service_type = identity {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.580350] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.580510] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.status_code_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.580671] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.status_code_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.580827] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.581014] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.581184] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] keystone.version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.581381] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.connection_uri = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.581542] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.cpu_mode = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.581708] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.cpu_model_extra_flags = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.581873] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.cpu_models = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.582056] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None 
None] libvirt.cpu_power_governor_high = performance {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.582233] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.cpu_power_governor_low = powersave {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.582396] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.cpu_power_management = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.582569] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.582734] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.device_detach_attempts = 8 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.582895] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.device_detach_timeout = 20 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.583067] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.disk_cachemodes = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.583258] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.disk_prefix = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.583431] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.enabled_perf_events = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.583594] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.file_backed_memory = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.583755] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.gid_maps = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.583911] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.hw_disk_discard = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.584081] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.hw_machine_type = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.584257] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.images_rbd_ceph_conf = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.584423] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.584599] env[60175]: DEBUG 
oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.584763] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.images_rbd_glance_store_name = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.584958] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.images_rbd_pool = rbd {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.585152] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.images_type = default {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.585312] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.images_volume_group = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.585472] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.inject_key = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.585633] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.inject_partition = -2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.585793] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.inject_password = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.585957] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.iscsi_iface = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.586133] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.iser_use_multipath = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.586298] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_bandwidth = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.586461] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_completion_timeout = 800 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.586620] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_downtime = 500 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.586780] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_downtime_delay = 75 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.586968] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_downtime_steps = 10 {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.587149] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_inbound_addr = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.587310] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_permit_auto_converge = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.587468] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_permit_post_copy = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.587639] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_scheme = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.587821] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_timeout_action = abort {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.588011] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_tunnelled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.588190] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_uri = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.588355] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.live_migration_with_native_tls = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.588513] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.max_queues = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.588679] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.mem_stats_period_seconds = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.588841] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.nfs_mount_options = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.589210] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.589391] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.num_aoe_discover_tries = 3 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.589559] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.num_iser_scan_tries = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.589720] env[60175]: DEBUG 
oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.num_memory_encrypted_guests = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.589883] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.num_nvme_discover_tries = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.590061] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.num_pcie_ports = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.590230] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.num_volume_scan_tries = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.590395] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.pmem_namespaces = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.590553] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.quobyte_client_cfg = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.590845] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.591032] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.rbd_connect_timeout = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.591203] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.591387] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.591561] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.rbd_secret_uuid = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.591722] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.rbd_user = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.591886] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.realtime_scheduler_priority = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.592069] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.remote_filesystem_transport = ssh {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.592232] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.rescue_image_id = None {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.592387] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.rescue_kernel_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.592544] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.rescue_ramdisk_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.592716] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.rng_dev_path = /dev/urandom {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.592871] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.rx_queue_size = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.593051] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.smbfs_mount_options = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.593338] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.593514] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.snapshot_compression = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.593678] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.snapshot_image_format = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.593902] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.594088] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.sparse_logical_volumes = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.594258] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.swtpm_enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.594432] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.swtpm_group = tss {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.594601] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.swtpm_user = tss {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.594770] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.sysinfo_serial = unique {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.594950] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.tx_queue_size = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.595139] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.uid_maps = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.595307] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.use_virtio_for_bridges = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.595480] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.virt_type = kvm {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.595648] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.volume_clear = zero {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.595812] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.volume_clear_size = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.595981] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.volume_use_multipath = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.596174] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.vzstorage_cache_path = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.596325] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.596496] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.vzstorage_mount_group = qemu {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.596662] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.vzstorage_mount_opts = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.596863] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.597160] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.597343] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.vzstorage_mount_user = stack {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.597514] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.597690] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.auth_section = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.597861] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.auth_type = password {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.598059] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.598231] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.598396] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.598555] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.connect_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.598715] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.connect_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.598888] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.default_floating_pool = public {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.599061] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.endpoint_override = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.599237] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.extension_sync_interval = 600 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.599395] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.http_retries = 3 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.599559] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.599718] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.599876] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.max_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.600056] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.metadata_proxy_shared_secret = **** {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.600216] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.min_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.600384] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.ovs_bridge = br-int {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.600548] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.physnets = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.600718] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.region_name = RegionOne {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.600887] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.service_metadata_proxy = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.601092] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.service_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.601289] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.service_type = network {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.601456] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.601616] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.status_code_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.601773] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.status_code_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.601932] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.602128] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.602292] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] neutron.version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.602464] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] notifications.bdms_in_notifications = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.602642] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] notifications.default_level = INFO 
{{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.602815] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] notifications.notification_format = unversioned {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.602981] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] notifications.notify_on_state_change = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.603172] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.603355] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] pci.alias = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.603550] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] pci.device_spec = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.603717] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] pci.report_in_placement = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.603893] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.auth_section = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.604079] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.auth_type = password {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.604252] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.auth_url = http://10.180.1.21/identity {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.604414] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.604573] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.604742] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.604900] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.connect_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.605074] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.connect_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.605236] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.default_domain_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.605394] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.default_domain_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.605549] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.domain_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.605708] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.domain_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.605867] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.endpoint_override = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.606043] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.606205] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.606363] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.max_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.606518] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.min_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.606685] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.password = **** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.606867] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.project_domain_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.607063] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.project_domain_name = Default {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.607237] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.project_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.607411] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.project_name = service {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.607579] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.region_name = RegionOne {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.607738] env[60175]: DEBUG oslo_service.service 
[None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.service_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.607913] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.service_type = placement {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.608105] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.608306] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.status_code_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.608569] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.status_code_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.608842] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.system_scope = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.609135] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.609426] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.trust_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.609699] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.user_domain_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.609996] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.user_domain_name = Default {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.610226] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.user_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.610416] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.username = placement {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.610606] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.610775] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] placement.version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.610958] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.cores = 20 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.611145] env[60175]: DEBUG 
oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.count_usage_from_placement = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.611321] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.611499] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.injected_file_content_bytes = 10240 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.611672] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.injected_file_path_length = 255 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.611839] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.injected_files = 5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.612016] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.instances = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.612190] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.key_pairs = 100 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.612360] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.metadata_items = 128 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.612528] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.ram = 51200 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.612692] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.recheck_quota = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.612861] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.server_group_members = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.613044] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] quota.server_groups = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.613255] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] rdp.enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.613574] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.613764] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.613935] 
env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.614118] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.image_metadata_prefilter = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.614288] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.614455] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.max_attempts = 3 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.614620] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.max_placement_results = 1000 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.614786] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.614961] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.query_placement_for_availability_zone = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.615124] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.query_placement_for_image_type_support = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.615293] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.615494] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] scheduler.workers = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.615678] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.615857] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.616050] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.616233] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.616400] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.616565] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.616729] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.616943] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.617135] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.host_subset_size = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.617303] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.image_properties_default_architecture = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.617467] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.617634] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.isolated_hosts = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.617800] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.isolated_images = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.617963] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.max_instances_per_host = 50 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.618141] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.618304] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.pci_in_placement = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.618464] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.618625] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.618784] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.618942] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.619120] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.619290] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.619459] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.track_instance_changes = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.619637] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.619818] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] metrics.required = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.619976] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] metrics.weight_multiplier = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.620152] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] metrics.weight_of_unavailable = -10000.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.620318] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] metrics.weight_setting = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.620617] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.620794] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] serial_console.enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.620972] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] serial_console.port_range = 10000:20000 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.621159] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.621330] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.621503] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] serial_console.serialproxy_port = 6083 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.621671] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] service_user.auth_section = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.621845] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] service_user.auth_type = password {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.622025] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] service_user.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.622186] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] service_user.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.622350] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] service_user.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.622515] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] service_user.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.622670] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] service_user.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.622838] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] service_user.send_service_user_token = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.623011] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] service_user.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.623181] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None 
None] service_user.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.623351] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.agent_enabled = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.623527] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.623818] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.624017] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.html5proxy_host = 0.0.0.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.624196] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.html5proxy_port = 6082 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.624359] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.image_compression = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.624521] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.jpeg_compression = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.624681] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.playback_compression = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.624876] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.server_listen = 127.0.0.1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.625085] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.625256] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.streaming_mode = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.625419] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] spice.zlib_compression = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.625587] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] upgrade_levels.baseapi = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.625746] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] upgrade_levels.cert = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.625917] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] upgrade_levels.compute = auto {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.626091] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] upgrade_levels.conductor = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.626261] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] upgrade_levels.scheduler = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.626430] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vendordata_dynamic_auth.auth_section = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.626593] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vendordata_dynamic_auth.auth_type = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.626750] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vendordata_dynamic_auth.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.626939] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vendordata_dynamic_auth.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.627123] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vendordata_dynamic_auth.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.627291] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vendordata_dynamic_auth.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.627467] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vendordata_dynamic_auth.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.627644] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vendordata_dynamic_auth.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.627808] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vendordata_dynamic_auth.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.628015] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.api_retry_count = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.628192] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.ca_file = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.628367] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.cache_prefix = devstack-image-cache {{(pid=60175) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.628536] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.cluster_name = testcl1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.628703] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.connection_pool_size = 10 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.628865] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.console_delay_seconds = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.629098] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.datastore_regex = ^datastore.* {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.629286] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.629425] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.host_password = **** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.629592] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.host_port = 443 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.629764] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.host_username = administrator@vsphere.local {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.629939] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.insecure = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.630115] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.integration_bridge = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.630284] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.maximum_objects = 100 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.630446] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.pbm_default_policy = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.630609] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.pbm_enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.630768] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.pbm_wsdl_location = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.630939] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.631115] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.serial_port_proxy_uri = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.631275] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.serial_port_service_uri = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.631440] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.task_poll_interval = 0.5 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.631613] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.use_linked_clone = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.631782] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.vnc_keymap = en-us {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.631950] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.vnc_port = 5900 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.632127] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vmware.vnc_port_total = 10000 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.632314] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vnc.auth_schemes = ['none'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.632489] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vnc.enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.632776] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.632963] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.633151] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vnc.novncproxy_port = 6080 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.633333] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vnc.server_listen = 127.0.0.1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.633507] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.633665] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b 
None None] vnc.vencrypt_ca_certs = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.633823] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vnc.vencrypt_client_cert = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.633980] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vnc.vencrypt_client_key = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.634174] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.634340] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.disable_deep_image_inspection = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.634502] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.disable_fallback_pcpu_query = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.634663] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.disable_group_policy_check_upcall = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.634833] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.635034] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.disable_rootwrap = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.635210] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.enable_numa_live_migration = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.635375] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.635538] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.635699] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.handle_virt_lifecycle_events = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.635862] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.libvirt_disable_apic = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.636038] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.636209] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.636371] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.636532] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.636694] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.636880] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.637098] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.637279] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.637443] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.637609] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.637908] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.637973] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] wsgi.client_socket_timeout = 900 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.638138] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] wsgi.default_pool_size = 1000 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.638309] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] wsgi.keep_alive = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.638477] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] 
wsgi.max_header_line = 16384 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.638642] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] wsgi.secure_proxy_ssl_header = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.638805] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] wsgi.ssl_ca_file = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.638968] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] wsgi.ssl_cert_file = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.639145] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] wsgi.ssl_key_file = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.639314] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] wsgi.tcp_keepidle = 600 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.639514] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.639696] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] zvm.ca_file = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.639860] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] zvm.cloud_connector_url = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.640166] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.640342] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] zvm.reachable_timeout = 300 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.640525] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_policy.enforce_new_defaults = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.640695] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_policy.enforce_scope = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.640870] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_policy.policy_default_rule = default {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.641073] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
[ 506.641253] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_policy.policy_file = policy.yaml {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.641428] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.641590] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.641750] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.641909] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.642081] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.642255] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.642430] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.642609] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] profiler.connection_string = messaging:// {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.642775] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] profiler.enabled = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.642944] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] profiler.es_doc_type = notification {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.643124] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] profiler.es_scroll_size = 10000 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.643301] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] profiler.es_scroll_time = 2m {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.643460] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] profiler.filter_error_trace = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.643629] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] profiler.hmac_keys = SECRET_KEY {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.643795] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] profiler.sentinel_service_name = mymaster {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.643967] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] profiler.socket_timeout = 0.1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.644151] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] profiler.trace_sqlalchemy = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.644317] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] remote_debug.host = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.644477] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] remote_debug.port = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.644654] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.644830] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.645018] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.645188] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.645352] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.645511] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.645671] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.645833] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.645993] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.646165] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.646335] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.646501] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.646671] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.646861] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.647054] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.647238] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.647405] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.647566] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.647729] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.647918] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.648102] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.648276] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.648438] env[60175]: DEBUG 
oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.648598] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.648763] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.648955] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.ssl = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.649178] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.649373] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.649541] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.649716] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.649888] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_rabbit.ssl_version = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.650091] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.650264] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_notifications.retry = -1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.650449] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.650624] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_messaging_notifications.transport_url = **** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.650795] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.auth_section = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.650958] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.auth_type = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.651131] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.cafile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.651333] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.certfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.651517] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.collect_timing = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.651677] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.connect_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.651836] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.connect_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.652043] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.endpoint_id = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.652211] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.endpoint_override = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.652375] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.insecure = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.652531] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.keyfile = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.652688] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.max_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.652845] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.min_version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.653007] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.region_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.653170] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.service_name = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.653326] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.service_type = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.653502] env[60175]: DEBUG oslo_service.service [None 
req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.split_loggers = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.653678] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.status_code_retries = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.653793] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.status_code_retry_delay = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.653948] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.timeout = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.654117] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.valid_interfaces = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.654277] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_limit.version = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.654442] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_reports.file_event_handler = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.654605] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_reports.file_event_handler_interval = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.654762] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] oslo_reports.log_dir = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.654956] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.655138] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_linux_bridge_privileged.group = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.655301] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.655467] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.655627] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.655784] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.655953] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.656126] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_ovs_privileged.group = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.656284] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_ovs_privileged.helper_command = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.656446] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.656607] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.656763] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] vif_plug_ovs_privileged.user = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.656965] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_linux_bridge.flat_interface = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.657170] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.657349] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.657521] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.657693] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.657877] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.658070] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.658239] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.658419] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_ovs.isolate_vif = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.658587] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.658751] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.658921] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.659103] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_ovs.ovsdb_interface = native {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.659269] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_vif_ovs.per_port_bridge = False {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.659432] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_brick.lock_path = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.659595] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.659754] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] os_brick.wait_mpath_device_interval = 1 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.659922] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] privsep_osbrick.capabilities = [21] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.660095] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] privsep_osbrick.group = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.660254] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] privsep_osbrick.helper_command = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.660414] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.660575] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] privsep_osbrick.thread_pool_size = 8 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.660730] env[60175]: DEBUG oslo_service.service 
[None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] privsep_osbrick.user = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.660897] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.661102] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] nova_sys_admin.group = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.661277] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] nova_sys_admin.helper_command = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.661443] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.661603] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] nova_sys_admin.thread_pool_size = 8 {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.661758] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] nova_sys_admin.user = None {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 506.661888] env[60175]: DEBUG oslo_service.service [None req-214df058-2ca3-4f15-b980-34c511f4a40b None None] ******************************************************************************** {{(pid=60175) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 506.662326] env[60175]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 506.670684] env[60175]: INFO nova.virt.node [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Generated node identity 3984c8da-53ad-4889-8d1f-23bab60fa84e [ 506.670918] env[60175]: INFO nova.virt.node [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Wrote node identity 3984c8da-53ad-4889-8d1f-23bab60fa84e to /opt/stack/data/n-cpu-1/compute_id [ 506.681810] env[60175]: WARNING nova.compute.manager [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Compute nodes ['3984c8da-53ad-4889-8d1f-23bab60fa84e'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 506.712960] env[60175]: INFO nova.compute.manager [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 506.731884] env[60175]: WARNING nova.compute.manager [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
[ 506.732121] env[60175]: DEBUG oslo_concurrency.lockutils [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 506.732331] env[60175]: DEBUG oslo_concurrency.lockutils [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 506.732499] env[60175]: DEBUG oslo_concurrency.lockutils [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 506.732668] env[60175]: DEBUG nova.compute.resource_tracker [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 506.733740] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e9b3e01-0d46-4b97-b596-bc8049e55866 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.742447] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a983e41e-cc5c-4708-9bc3-bf550c3cfe06 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.756154] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4304a069-468f-4e50-af71-12a87ef009dd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.762178] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93d2ed74-6d35-4687-a8f4-c4ef5f5b55fc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.790815] env[60175]: DEBUG nova.compute.resource_tracker [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180704MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 506.790962] env[60175]: DEBUG oslo_concurrency.lockutils [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 506.791147] env[60175]: DEBUG oslo_concurrency.lockutils [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 506.803355] env[60175]: WARNING nova.compute.resource_tracker [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] No compute node 
record for cpu-1:3984c8da-53ad-4889-8d1f-23bab60fa84e: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 3984c8da-53ad-4889-8d1f-23bab60fa84e could not be found. [ 506.815603] env[60175]: INFO nova.compute.resource_tracker [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 3984c8da-53ad-4889-8d1f-23bab60fa84e [ 506.867315] env[60175]: DEBUG nova.compute.resource_tracker [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 506.867507] env[60175]: DEBUG nova.compute.resource_tracker [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=149GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 506.971039] env[60175]: INFO nova.scheduler.client.report [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] [req-6a5e5bfd-9802-415e-8175-6146068a4274] Created resource provider record via placement API for resource provider with UUID 3984c8da-53ad-4889-8d1f-23bab60fa84e and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 506.987442] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2837b19f-4574-4f31-8ede-bd231ccd3ea1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 506.994767] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-853b9d57-1490-473a-8ae6-dd614398bb69 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 507.024084] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01cac08b-a1e6-415c-97d2-cf2e70da8eba {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 507.030985] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8671eb22-6ae4-4062-ac6c-01df1c7740bf {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 507.043843] env[60175]: DEBUG nova.compute.provider_tree [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Updating inventory in ProviderTree for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 507.080216] env[60175]: DEBUG nova.scheduler.client.report [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Updated inventory for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 507.080436] env[60175]: DEBUG nova.compute.provider_tree [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Updating resource provider 3984c8da-53ad-4889-8d1f-23bab60fa84e generation from 0 to 1 during operation: update_inventory {{(pid=60175) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 507.080577] env[60175]: DEBUG nova.compute.provider_tree [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Updating inventory in ProviderTree for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 507.124231] env[60175]: DEBUG nova.compute.provider_tree [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Updating resource provider 3984c8da-53ad-4889-8d1f-23bab60fa84e generation from 1 to 2 during operation: update_traits {{(pid=60175) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 507.142137] env[60175]: DEBUG nova.compute.resource_tracker [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 507.142382] env[60175]: DEBUG oslo_concurrency.lockutils [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.351s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 507.142539] env[60175]: DEBUG nova.service [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Creating RPC server for service compute {{(pid=60175) start /opt/stack/nova/nova/service.py:182}} [ 507.155573] env[60175]: DEBUG nova.service [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] Join ServiceGroup membership for this service compute {{(pid=60175) start /opt/stack/nova/nova/service.py:199}} [ 507.155573] env[60175]: DEBUG nova.servicegroup.drivers.db [None req-0935081f-099a-4e69-81f5-bd3522b7f699 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=60175) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 533.157074] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 533.168021] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Getting list of instances from cluster (obj){ [ 533.168021] env[60175]: value = "domain-c8" [ 533.168021] env[60175]: _type = "ClusterComputeResource" [ 533.168021] env[60175]: } {{(pid=60175) list_instances 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 533.169379] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6296c1e-00b4-4b48-b029-ad99975508c4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 533.178861] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Got total of 0 instances {{(pid=60175) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 533.179093] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 533.179404] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Getting list of instances from cluster (obj){ [ 533.179404] env[60175]: value = "domain-c8" [ 533.179404] env[60175]: _type = "ClusterComputeResource" [ 533.179404] env[60175]: } {{(pid=60175) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 533.180262] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a19f607b-915e-42c9-b99f-60ea7c0d2a85 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 533.187536] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Got total of 0 instances {{(pid=60175) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 552.740082] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "dedec08e-95d1-4467-96a4-cdec5f170e01" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.740392] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "dedec08e-95d1-4467-96a4-cdec5f170e01" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.760047] env[60175]: DEBUG nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 552.861163] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 552.863105] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 552.863642] env[60175]: INFO nova.compute.claims [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 553.014896] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8698e5bb-d721-40a6-ab33-4b69cce97140 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.022405] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-383e1d96-614f-4bd3-b659-ada5f377cb4a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.063565] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f9a42da-0553-44f3-bca9-e2d507064a16 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.072476] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1caed7a1-5705-498f-ae75-b7533e980cab {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.089271] env[60175]: DEBUG nova.compute.provider_tree [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 553.100630] env[60175]: DEBUG nova.scheduler.client.report [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 553.124999] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 
tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.124999] env[60175]: DEBUG nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 553.171217] env[60175]: DEBUG nova.compute.utils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 553.176283] env[60175]: DEBUG nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Not allocating networking since 'none' was specified. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 553.179884] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "a0755f79-7df4-4660-92e6-5dd80af94aaa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.179884] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "a0755f79-7df4-4660-92e6-5dd80af94aaa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.191211] env[60175]: DEBUG nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 553.195573] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 553.274578] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.274674] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.276181] env[60175]: INFO nova.compute.claims [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 553.296051] env[60175]: DEBUG nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Start spawning the instance on the hypervisor. {{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 553.422620] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "b5636e10-af08-49d3-a9b2-8122521a9e2c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.422863] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "b5636e10-af08-49d3-a9b2-8122521a9e2c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.440575] env[60175]: DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 553.447149] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "99d97004-9f23-48ee-a88b-75fdb6acc4b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.447399] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "99d97004-9f23-48ee-a88b-75fdb6acc4b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.454277] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e59e37da-68b6-47f1-b5f5-b02ed834c038 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.463725] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42279bc5-0608-4d19-a8fb-11db6311ec51 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.469075] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 553.502575] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be41f93a-04c1-48fb-ada3-4e02f3cd72fd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.511980] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75a92ef6-e0b3-4815-b4d9-c5b7796f12cd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.520569] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 553.520794] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 553.520942] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 553.521145] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 553.521284] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 553.521423] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 553.521642] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Topology preferred 
VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 553.521789] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 553.523894] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 553.524120] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 553.524536] env[60175]: DEBUG nova.virt.hardware [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 553.527511] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d709138f-a552-4f14-a729-ed99c2e93f07 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.552871] env[60175]: DEBUG nova.compute.provider_tree [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 553.554687] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.567045] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d08842f8-ea4e-4c0c-b922-2e29b9832c79 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.574183] env[60175]: DEBUG nova.scheduler.client.report [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 553.578482] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 553.591761] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f336abfe-71de-49c8-ac2c-fcaef77b3af2 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.606661] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.332s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.607175] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 553.612393] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.058s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.616010] env[60175]: INFO nova.compute.claims [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 553.624179] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Instance VIF info [] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 553.637231] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 553.638382] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5ed47149-169c-4790-8573-128041c922f0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.650849] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Created folder: OpenStack in parent group-v4. 
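The repeated 'Acquiring lock "compute_resources" ... :: waited/held' lines in this trace come from oslo.concurrency's synchronized decorator, which logs the acquire, the wait time, and the hold time around each critical section in the resource tracker. Below is a minimal, self-contained sketch of that pattern; the decorated function and the sleep are hypothetical stand-ins, not Nova's actual instance_claim code.

import time

from oslo_concurrency import lockutils


@lockutils.synchronized('compute_resources')
def instance_claim(instance_uuid):
    # Runs while holding the in-process lock named "compute_resources".
    # The lockutils wrapper emits the DEBUG 'Acquiring lock ... by ...',
    # 'Lock ... acquired ... waited Ns' and 'Lock ... released ... held Ns'
    # messages seen throughout this log.
    time.sleep(0.1)  # stand-in for the claim bookkeeping
    return instance_uuid


if __name__ == '__main__':
    instance_claim('dedec08e-95d1-4467-96a4-cdec5f170e01')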
[ 553.650849] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Creating folder: Project (c11194ae93ad4f41bf4a8e89106dfa44). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 553.650981] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-310ea434-dea4-44ac-9d08-c04933e9dd9b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.663928] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Created folder: Project (c11194ae93ad4f41bf4a8e89106dfa44) in parent group-v845475. [ 553.664148] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Creating folder: Instances. Parent ref: group-v845476. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 553.664397] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c364ef4e-87aa-4c4f-8052-9a9a364ba7a5 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.670925] env[60175]: DEBUG nova.compute.utils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 553.672224] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 553.675400] env[60175]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 553.680148] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Created folder: Instances in parent group-v845476. [ 553.680588] env[60175]: DEBUG oslo.service.loopingcall [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 553.680833] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 553.681229] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a494036d-24a5-4768-8471-0d309a80b05e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.694982] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 553.702138] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 553.702138] env[60175]: value = "task-4292837" [ 553.702138] env[60175]: _type = "Task" [ 553.702138] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 553.718027] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292837, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 553.783328] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Start spawning the instance on the hypervisor. {{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 553.788879] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e422cc2-b26b-47e2-9b58-e4f7853fa62e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.796846] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc3a36a9-0802-494b-8cab-c4fd5bc6bdfa {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.830949] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 553.830949] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a 
tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 553.830949] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 553.831199] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 553.831650] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 553.831734] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 553.831916] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 553.832092] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 553.832402] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 553.832402] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 553.832577] env[60175]: DEBUG nova.virt.hardware [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 553.833900] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-450bdb16-2cef-4983-b911-1e901f411333 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} 
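The nova.virt.hardware lines above show the driver enumerating CPU topologies for the 1-vCPU m1.nano flavor: with no flavor or image limits, the maxima default to 65536 sockets/cores/threads, and the only factorization of one vCPU is 1:1:1. The snippet below is an illustrative sketch of that enumeration, using a simplified stand-in for Nova's VirtCPUTopology object rather than the real nova.virt.hardware implementation.

import collections

# Simplified stand-in for nova.objects.VirtCPUTopology.
VirtCPUTopology = collections.namedtuple(
    'VirtCPUTopology', ['sockets', 'cores', 'threads'])


def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Yield every sockets*cores*threads split that equals vcpus."""
    for sockets in range(1, min(max_sockets, vcpus) + 1):
        for cores in range(1, min(max_cores, vcpus) + 1):
            for threads in range(1, min(max_threads, vcpus) + 1):
                if sockets * cores * threads == vcpus:
                    yield VirtCPUTopology(sockets, cores, threads)


# For the 1-vCPU flavor in this log there is a single candidate, 1:1:1,
# matching the "Got 1 possible topologies" entry above.
print(list(possible_topologies(1)))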
[ 553.836895] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69640c81-9238-4d6f-a2f6-10daa5edf7a3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.847543] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc92f05e-9c69-4860-8965-d18e307d98b7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.854022] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3658ad4-4caa-487a-979c-0652b108f789 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 553.867352] env[60175]: DEBUG nova.compute.provider_tree [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 553.882018] env[60175]: DEBUG nova.scheduler.client.report [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 553.898041] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 553.898626] env[60175]: DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Start building networks asynchronously for instance. 
{{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 553.901864] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.323s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 553.903487] env[60175]: INFO nova.compute.claims [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 553.962970] env[60175]: DEBUG nova.policy [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0c1c3b48259043c68eb019adafc1a116', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd02a18280b9642539083abb609f328d5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 553.970919] env[60175]: DEBUG nova.compute.utils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 553.974026] env[60175]: DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 553.974026] env[60175]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 553.986597] env[60175]: DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Start building block device mappings for instance. 
{{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 554.075428] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9316701-3e2f-43d6-b6b9-cdc7b9ea5351 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.080867] env[60175]: DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Start spawning the instance on the hypervisor. {{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 554.087258] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ee56722-87bf-471a-b467-ccce44ea2c1b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.130224] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6356311-4f83-443f-ae70-0349cbd2ff25 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.140892] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-273ff4ec-2459-4c25-9b5e-a95fc7966996 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.147257] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 554.147473] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 554.147626] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 554.147834] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 554.147982] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 554.148129] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 554.148329] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 554.148477] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 554.148632] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 554.148814] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 554.148987] env[60175]: DEBUG nova.virt.hardware [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 554.151770] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe7dffc0-1aec-4165-9c77-93dfeb2eed3a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.166826] env[60175]: DEBUG nova.compute.provider_tree [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 554.171892] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caaba562-2125-4720-9770-2750b6eaa1b3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.190022] env[60175]: DEBUG nova.scheduler.client.report [None 
req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 554.213240] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 554.213240] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 554.221219] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292837, 'name': CreateVM_Task, 'duration_secs': 0.364979} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 554.221367] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 554.222391] env[60175]: DEBUG oslo_vmware.service [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75846a37-df20-4448-afd6-9f8fe194344b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.228306] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 554.228460] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 554.229119] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 554.229355] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-020c6af9-b538-4b15-b5b5-69182ca318e5 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.234328] env[60175]: DEBUG oslo_vmware.api [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Waiting for the task: (returnval){ [ 554.234328] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52ccdd3b-c0cf-dd6f-dfd3-0415ff747157" [ 554.234328] env[60175]: _type = "Task" [ 554.234328] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 554.242037] env[60175]: DEBUG oslo_vmware.api [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52ccdd3b-c0cf-dd6f-dfd3-0415ff747157, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 554.243522] env[60175]: DEBUG nova.policy [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '63fafad766aa42758af1a36008299adb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '72271ead0a4a44da9b2f69a5062734e2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 554.269807] env[60175]: DEBUG nova.compute.utils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 554.271665] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 554.271665] env[60175]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 554.292920] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Start building block device mappings for instance. 
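Editor's note: the Task records above (CreateVM_Task completing, SearchDatastore_Task polled at 0%) follow the submit-then-poll pattern the driver uses for every vCenter task. A minimal, generic sketch of that loop; poll() is a stand-in callable introduced for illustration, not an oslo.vmware API:

    import time

    def wait_for_task(poll, interval=0.5):
        # poll() is any callable returning "queued", "running", "success" or
        # "error", standing in for a TaskInfo.state lookup against vCenter.
        while True:
            state = poll()
            if state == "success":
                return
            if state == "error":
                raise RuntimeError("vCenter task reported an error")
            time.sleep(interval)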
{{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 554.381359] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Start spawning the instance on the hypervisor. {{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 554.407983] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.409841] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.424455] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 554.424783] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 554.424854] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 554.425045] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 554.425186] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f 
tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 554.425414] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 554.425996] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 554.425996] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 554.425996] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 554.425996] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 554.426680] env[60175]: DEBUG nova.virt.hardware [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 554.427342] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f4a6fe7-616e-4faa-a767-82c363e13f43 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.435912] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Starting instance... 
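Editor's note: the nova.virt.hardware lines above repeat the same computation for every boot: 1 vCPU with 65536:65536:65536 limits reduces to the single topology 1:1:1. A simplified illustration of that enumeration (an editor's sketch of the idea, not nova's actual code):

    from collections import namedtuple

    VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

    def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
        # Enumerate sockets/cores/threads combinations whose product matches
        # the requested vCPU count, bounded by the flavor/image limits.
        found = []
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        found.append(VirtCPUTopology(s, c, t))
        return found

    print(possible_topologies(1, 65536, 65536, 65536))
    # -> [VirtCPUTopology(sockets=1, cores=1, threads=1)]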
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 554.443577] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abdeaa33-f4d9-47b2-9e4f-0e24e564d099 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.485098] env[60175]: DEBUG nova.policy [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '51ecb9d39b5e4829a1d68198cb05e5a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '502b35c7d9b44881ac0c5052e7783f3d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 554.501602] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.501929] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.503437] env[60175]: INFO nova.compute.claims [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 554.655977] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63d88e7e-4f00-42e6-8479-b58c7d14f089 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.665785] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b46aa3de-6291-49ff-bfa9-eeb55e445301 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.701780] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e7e3dc7-2e5c-441f-9e8c-2e1f65d704cc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.714905] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d2ae133-9511-4d76-9109-12d7d2caf70a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.735615] env[60175]: DEBUG nova.compute.provider_tree [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Inventory has 
not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 554.748596] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 554.748596] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 554.748596] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 554.748596] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 554.748759] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 554.751937] env[60175]: DEBUG nova.scheduler.client.report [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 554.752896] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-10217abc-be08-420c-aae3-a6d09bcf46b1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.766214] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s 
{{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 554.766723] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 554.773339] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 554.773451] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 554.774236] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2576e2a6-7404-4b05-985a-600e0790cf92 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.784874] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d5495c94-9bec-4010-bd7a-5708123b56af {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.793410] env[60175]: DEBUG oslo_vmware.api [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Waiting for the task: (returnval){ [ 554.793410] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]529e2c4a-4fee-d4c1-4b9e-722797299c16" [ 554.793410] env[60175]: _type = "Task" [ 554.793410] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 554.804251] env[60175]: DEBUG oslo_vmware.api [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]529e2c4a-4fee-d4c1-4b9e-722797299c16, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 554.817123] env[60175]: DEBUG nova.compute.utils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 554.818517] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Allocating IP information in the background. 
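Editor's note: the lockutils lines above come in acquire/release pairs: "compute_resources" serializes resource claims on the host, and the per-image "[datastore2] devstack-image-cache_base/<image-id>" lock serializes image-cache preparation. A minimal sketch of the same pattern with oslo.concurrency, assuming the library is available; the function bodies are placeholders:

    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def claim_resources():
        # Runs under the same named lock the ResourceTracker holds around
        # instance_claim in the records above.
        pass

    def prepare_cached_image(image_id):
        # Only one request at a time works on a given image-cache directory.
        with lockutils.lock("[datastore2] devstack-image-cache_base/%s" % image_id):
            pass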
{{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 554.818990] env[60175]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 554.833692] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 554.913221] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Start spawning the instance on the hypervisor. {{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 554.944108] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 554.944400] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 554.945174] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 554.945174] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 554.946141] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 554.946141] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 554.946141] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 554.946141] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 554.946141] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 554.946414] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 554.950500] env[60175]: DEBUG nova.virt.hardware [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 554.950599] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-596b2248-8269-4265-adee-9f460d27046f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.959890] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bd37fef-4811-4523-acaa-ad3880aa0194 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.109416] env[60175]: DEBUG nova.policy [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b037cf8807344a588fb6691968546879', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a2116357188240649088d46f3a6c3c50', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 555.306191] 
env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 555.306508] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Creating directory with path [datastore2] vmware_temp/24f22618-a1c4-4ac8-b5f7-2d35133b9d1b/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 555.308018] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8e193482-56aa-4d9e-b47e-92c9615a7921 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.328468] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Created directory with path [datastore2] vmware_temp/24f22618-a1c4-4ac8-b5f7-2d35133b9d1b/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 555.328745] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Fetch image to [datastore2] vmware_temp/24f22618-a1c4-4ac8-b5f7-2d35133b9d1b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 555.329141] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/24f22618-a1c4-4ac8-b5f7-2d35133b9d1b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 555.329878] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11d5c354-ddb5-42ba-ae53-cb5def014491 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.344408] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08b18efe-83b7-4234-91a1-a8e2886a27af {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.356425] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78de0f09-0b26-4785-a1fd-52dc3c58b51f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.398061] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df867717-a284-4b84-bcfd-b46bfa3a5978 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.408922] env[60175]: DEBUG 
oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-686dca29-de0b-4e1f-9331-5381440ef578 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.498511] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 555.525553] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "72caf1e5-e894-4581-a95d-21dda85e11b0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 555.526614] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "72caf1e5-e894-4581-a95d-21dda85e11b0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 555.539832] env[60175]: DEBUG nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 555.570710] env[60175]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Successfully created port: d8e06d4a-934f-4b21-9e63-2f767deba066 {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 555.586302] env[60175]: DEBUG oslo_vmware.rw_handles [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/24f22618-a1c4-4ac8-b5f7-2d35133b9d1b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
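Editor's note: the rw_handles record above shows the image data being pushed over HTTPS to the datastore "folder" endpoint, addressed by the dcPath and dsName query parameters. An illustrative sketch of that upload using requests; the real transfer goes through oslo_vmware.rw_handles with a vCenter service ticket, and the host, path and cookie arguments below are placeholders:

    import requests

    def upload_to_datastore(host, ds_relative_path, local_file, cookie):
        # e.g. ds_relative_path = "vmware_temp/<uuid>/<image-id>/tmp-sparse.vmdk"
        url = "https://%s:443/folder/%s" % (host, ds_relative_path)
        params = {"dcPath": "ha-datacenter", "dsName": "datastore2"}
        with open(local_file, "rb") as f:
            resp = requests.put(url, params=params, data=f,
                                headers={"Cookie": cookie}, verify=False)
        resp.raise_for_status()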
{{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 555.656830] env[60175]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Successfully created port: e74e7f64-1640-4948-8af7-86b8d9f2542e {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 555.661997] env[60175]: DEBUG oslo_vmware.rw_handles [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 555.666389] env[60175]: DEBUG oslo_vmware.rw_handles [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/24f22618-a1c4-4ac8-b5f7-2d35133b9d1b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 555.678633] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 555.678915] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 555.680420] env[60175]: INFO nova.compute.claims [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 555.853909] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6df44ff3-efb3-4cb3-ae1b-811fbcc25690 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.863198] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc7d5f91-86a3-4824-8070-69e72f534081 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.869477] env[60175]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Successfully created port: f3d03414-4f14-4dc8-b894-fed370ba6abe {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 555.897815] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-610e617a-e27a-492f-aa30-dcb3bb43ebbc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.908069] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc50f789-3c37-4488-8395-7980589a82b1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.925277] env[60175]: DEBUG nova.compute.provider_tree [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 555.930640] env[60175]: DEBUG nova.scheduler.client.report [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 555.948631] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 555.949130] env[60175]: DEBUG nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 555.993728] env[60175]: DEBUG nova.compute.utils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 555.995744] env[60175]: DEBUG nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Allocating IP information in the background. 
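Editor's note: the inventory payload repeated in the report-client lines above is what placement uses to admit these claims. Usable capacity per resource class is (total - reserved) * allocation_ratio, with max_unit capping any single allocation; a quick check with the logged numbers:

    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0, "max_unit": 16},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0, "max_unit": 65530},
        "DISK_GB":   {"total": 268,    "reserved": 0,   "allocation_ratio": 1.0, "max_unit": 146},
    }
    for rc, inv in inventory.items():
        usable = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print("%s: usable=%s, per-allocation cap=%s" % (rc, usable, inv["max_unit"]))
    # VCPU: usable=192.0, MEMORY_MB: usable=196078.0, DISK_GB: usable=268.0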
{{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 555.996548] env[60175]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 556.011574] env[60175]: DEBUG nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 556.103109] env[60175]: DEBUG nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Start spawning the instance on the hypervisor. {{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 556.132471] env[60175]: DEBUG nova.virt.hardware [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 556.132717] env[60175]: DEBUG nova.virt.hardware [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 556.132875] env[60175]: DEBUG nova.virt.hardware [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 556.133078] env[60175]: DEBUG nova.virt.hardware [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 556.133238] env[60175]: DEBUG nova.virt.hardware [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 556.133395] env[60175]: DEBUG nova.virt.hardware 
[None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 556.133615] env[60175]: DEBUG nova.virt.hardware [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 556.133782] env[60175]: DEBUG nova.virt.hardware [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 556.133959] env[60175]: DEBUG nova.virt.hardware [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 556.134673] env[60175]: DEBUG nova.virt.hardware [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 556.134945] env[60175]: DEBUG nova.virt.hardware [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 556.135759] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e032e264-2fe0-47e0-b1ec-8a5191825895 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.145398] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-324fc5a3-4547-4317-a012-0a0e9fc6d408 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.261534] env[60175]: DEBUG nova.policy [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef369e94b11e4ec987e2455f4232d947', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce403fd7ca154238a6c92f219ddf95fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 557.244893] env[60175]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 
tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Successfully created port: 8dcb7ce7-6d69-476a-a3c3-9988b72d22ec {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 557.267131] env[60175]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Successfully created port: 090c1147-249c-422b-914d-47a8a7fb841b {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 558.261456] env[60175]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Successfully updated port: d8e06d4a-934f-4b21-9e63-2f767deba066 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 558.274076] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "refresh_cache-a0755f79-7df4-4660-92e6-5dd80af94aaa" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 558.274076] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquired lock "refresh_cache-a0755f79-7df4-4660-92e6-5dd80af94aaa" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 558.274076] env[60175]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 558.395384] env[60175]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Successfully updated port: f3d03414-4f14-4dc8-b894-fed370ba6abe {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 558.407199] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "refresh_cache-99d97004-9f23-48ee-a88b-75fdb6acc4b8" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 558.407199] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquired lock "refresh_cache-99d97004-9f23-48ee-a88b-75fdb6acc4b8" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 558.407199] env[60175]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 
99d97004-9f23-48ee-a88b-75fdb6acc4b8] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 558.417789] env[60175]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 558.552966] env[60175]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 558.936650] env[60175]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Successfully updated port: e74e7f64-1640-4948-8af7-86b8d9f2542e {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 558.950890] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "refresh_cache-b5636e10-af08-49d3-a9b2-8122521a9e2c" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 558.951286] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquired lock "refresh_cache-b5636e10-af08-49d3-a9b2-8122521a9e2c" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 558.951286] env[60175]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 559.107616] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 559.108030] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 559.118135] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 
tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 559.170733] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 559.171255] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 559.176229] env[60175]: INFO nova.compute.claims [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 559.272467] env[60175]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 559.352916] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa4b218f-c621-464f-84a0-313c016b01ea {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.364063] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-386beafe-d6db-462d-8d78-41ac075d8060 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.399518] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e737350c-f7d2-47c6-a2dd-6acc9f3dd4b9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.404414] env[60175]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Updating instance_info_cache with network_info: [{"id": "d8e06d4a-934f-4b21-9e63-2f767deba066", "address": "fa:16:3e:63:be:47", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8e06d4a-93", "ovs_interfaceid": "d8e06d4a-934f-4b21-9e63-2f767deba066", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 559.411423] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55c45fa8-e906-4deb-aee5-f07ed2d8ce45 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.417980] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Releasing lock "refresh_cache-a0755f79-7df4-4660-92e6-5dd80af94aaa" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 559.418296] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Instance network_info: |[{"id": "d8e06d4a-934f-4b21-9e63-2f767deba066", "address": "fa:16:3e:63:be:47", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8e06d4a-93", "ovs_interfaceid": "d8e06d4a-934f-4b21-9e63-2f767deba066", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 559.427109] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:63:be:47', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '537e0890-4fa2-4f2d-b74c-49933a4edf53', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd8e06d4a-934f-4b21-9e63-2f767deba066', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 559.435137] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Creating folder: Project 
(d02a18280b9642539083abb609f328d5). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.436663] env[60175]: DEBUG nova.compute.provider_tree [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 559.437085] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-74ae0543-514e-4496-b2f6-9665ebb4c899 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.444541] env[60175]: DEBUG nova.scheduler.client.report [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 559.451569] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Created folder: Project (d02a18280b9642539083abb609f328d5) in parent group-v845475. [ 559.451863] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Creating folder: Instances. Parent ref: group-v845479. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.451969] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eebaa559-fb33-48dc-9e13-49798f6b9c07 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.459984] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 559.460566] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 559.464516] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Created folder: Instances in parent group-v845479. 
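(Editorial sketch.) The Folder.CreateFolder and Folder.CreateVM_Task invocations around this point, together with the task-progress polling that follows, are instances of oslo.vmware's invoke-and-wait pattern. Below is a minimal sketch of that pattern, assuming placeholder connection details, helper names, and managed-object references that are not taken from this log; only the oslo.vmware calls themselves are the library's real API.

from oslo_vmware import api as vmware_api


def get_session():
    # Placeholder vCenter endpoint and credentials -- illustrative only.
    # task_poll_interval is what drives the periodic "progress is N%" lines.
    return vmware_api.VMwareAPISession(
        'vcenter.example.org', 'nova-user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)


def create_vm(session, folder_ref, config_spec, pool_ref):
    # Folder.CreateVM_Task returns a Task managed-object reference; the
    # actual work happens asynchronously on the vCenter side.
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=pool_ref)
    # wait_for_task() polls that task (producing _poll_task progress entries
    # like those in this log) and raises if vCenter reports an error; on
    # success its TaskInfo result holds the new VM's managed-object reference.
    task_info = session.wait_for_task(task)
    return task_info.result

Because the work is asynchronous on the vCenter side and only the polling goes through the shared API session, several CreateVM_Task operations can be in flight at once, as in the surrounding log.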
[ 559.464516] env[60175]: DEBUG oslo.service.loopingcall [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 559.464782] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 559.465011] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a86e5b8a-ec5e-44d3-a133-7d19a7a25d13 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.480330] env[60175]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Updating instance_info_cache with network_info: [{"id": "f3d03414-4f14-4dc8-b894-fed370ba6abe", "address": "fa:16:3e:e1:fb:eb", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf3d03414-4f", "ovs_interfaceid": "f3d03414-4f14-4dc8-b894-fed370ba6abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 559.486397] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 559.486397] env[60175]: value = "task-4292840" [ 559.486397] env[60175]: _type = "Task" [ 559.486397] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 559.495096] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292840, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 559.498767] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Releasing lock "refresh_cache-99d97004-9f23-48ee-a88b-75fdb6acc4b8" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 559.499613] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Instance network_info: |[{"id": "f3d03414-4f14-4dc8-b894-fed370ba6abe", "address": "fa:16:3e:e1:fb:eb", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf3d03414-4f", "ovs_interfaceid": "f3d03414-4f14-4dc8-b894-fed370ba6abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 559.499742] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e1:fb:eb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '537e0890-4fa2-4f2d-b74c-49933a4edf53', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f3d03414-4f14-4dc8-b894-fed370ba6abe', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 559.507417] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Creating folder: Project (502b35c7d9b44881ac0c5052e7783f3d). Parent ref: group-v845475. 
{{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.510575] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4e57ad40-78d4-4e06-9ef1-ee4b51977a71 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.511258] env[60175]: DEBUG nova.compute.utils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 559.512817] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 559.512989] env[60175]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 559.521644] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 559.528053] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Created folder: Project (502b35c7d9b44881ac0c5052e7783f3d) in parent group-v845475. [ 559.529581] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Creating folder: Instances. Parent ref: group-v845482. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.529581] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-81731b26-24cd-4cee-8f83-67f0e1bc558a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.541056] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Created folder: Instances in parent group-v845482. [ 559.541056] env[60175]: DEBUG oslo.service.loopingcall [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 559.541713] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 559.541896] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-08fe667a-7ba4-4d6e-8454-f0c37b7c82e0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.570438] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 559.570438] env[60175]: value = "task-4292843" [ 559.570438] env[60175]: _type = "Task" [ 559.570438] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 559.583276] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292843, 'name': CreateVM_Task} progress is 5%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 559.614345] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Start spawning the instance on the hypervisor. {{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 559.645404] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 559.645668] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 559.645995] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 559.646081] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 559.646235] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c 
tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 559.646385] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 559.646597] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 559.646754] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 559.647171] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 559.647351] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 559.647928] env[60175]: DEBUG nova.virt.hardware [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 559.648443] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83bb46b0-892c-4836-9665-28a1f09b5fd9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.660070] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b67a87f7-b5b4-4eb1-8538-10a36caee6a7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 559.740258] env[60175]: DEBUG nova.policy [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4930bb1ad0cc4376a388847b3238dded', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ed721a0a42ee43fba6f37868594bffec', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize 
/opt/stack/nova/nova/policy.py:203}} [ 559.996827] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292840, 'name': CreateVM_Task, 'duration_secs': 0.3192} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 559.996996] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 560.035074] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 560.035074] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 560.035074] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 560.035363] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1b48daf0-ba84-4c7c-96e1-0c2347178ecb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 560.042454] env[60175]: DEBUG oslo_vmware.api [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Waiting for the task: (returnval){ [ 560.042454] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52ef7487-3020-3103-b006-35e6c559395f" [ 560.042454] env[60175]: _type = "Task" [ 560.042454] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 560.050357] env[60175]: DEBUG oslo_vmware.api [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52ef7487-3020-3103-b006-35e6c559395f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 560.080964] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292843, 'name': CreateVM_Task, 'duration_secs': 0.315192} completed successfully. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 560.080964] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 560.081527] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 560.145815] env[60175]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Successfully updated port: 090c1147-249c-422b-914d-47a8a7fb841b {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 560.146859] env[60175]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Updating instance_info_cache with network_info: [{"id": "e74e7f64-1640-4948-8af7-86b8d9f2542e", "address": "fa:16:3e:05:55:d2", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape74e7f64-16", "ovs_interfaceid": "e74e7f64-1640-4948-8af7-86b8d9f2542e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 560.156888] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "refresh_cache-72caf1e5-e894-4581-a95d-21dda85e11b0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 560.157046] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquired lock "refresh_cache-72caf1e5-e894-4581-a95d-21dda85e11b0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 560.157200] env[60175]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 
tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 560.168928] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Releasing lock "refresh_cache-b5636e10-af08-49d3-a9b2-8122521a9e2c" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 560.170028] env[60175]: DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Instance network_info: |[{"id": "e74e7f64-1640-4948-8af7-86b8d9f2542e", "address": "fa:16:3e:05:55:d2", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape74e7f64-16", "ovs_interfaceid": "e74e7f64-1640-4948-8af7-86b8d9f2542e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 560.172795] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:05:55:d2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '537e0890-4fa2-4f2d-b74c-49933a4edf53', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e74e7f64-1640-4948-8af7-86b8d9f2542e', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 560.182095] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Creating folder: Project (72271ead0a4a44da9b2f69a5062734e2). Parent ref: group-v845475. 
{{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 560.183597] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2da00bc4-e762-4c20-bbe1-27cb5d0f6b7b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 560.195117] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Created folder: Project (72271ead0a4a44da9b2f69a5062734e2) in parent group-v845475. [ 560.195337] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Creating folder: Instances. Parent ref: group-v845485. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 560.195883] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bf40ebe1-74da-4165-98d2-b6fff17df3a9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 560.207671] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Created folder: Instances in parent group-v845485. [ 560.208067] env[60175]: DEBUG oslo.service.loopingcall [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 560.208176] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 560.208383] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7f19cb13-a88e-4acc-a580-b2ca6dd878dd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 560.229839] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 560.229839] env[60175]: value = "task-4292846" [ 560.229839] env[60175]: _type = "Task" [ 560.229839] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 560.239884] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292846, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 560.372317] env[60175]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 560.399636] env[60175]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Successfully updated port: 8dcb7ce7-6d69-476a-a3c3-9988b72d22ec {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 560.409046] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "refresh_cache-3107f9c0-9a35-424c-9fa3-d60057b9ceec" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 560.409388] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquired lock "refresh_cache-3107f9c0-9a35-424c-9fa3-d60057b9ceec" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 560.409388] env[60175]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 560.553756] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 560.554063] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 560.554300] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 560.554535] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 560.554836] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquired external semaphore "[datastore2] 
devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 560.555103] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f4a3d8a-8b5c-4956-a036-1983b1bc29aa {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 560.560209] env[60175]: DEBUG oslo_vmware.api [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Waiting for the task: (returnval){ [ 560.560209] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e2af08-5b65-51cc-1e00-17cd173f949b" [ 560.560209] env[60175]: _type = "Task" [ 560.560209] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 560.569416] env[60175]: DEBUG oslo_vmware.api [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e2af08-5b65-51cc-1e00-17cd173f949b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 560.742607] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292846, 'name': CreateVM_Task} progress is 99%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 560.744885] env[60175]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 560.971725] env[60175]: DEBUG nova.compute.manager [req-e1395a3e-3dab-4ad1-98cc-1deca30fb30d req-8b1d6ded-5260-4eff-987b-07e7b22db6c7 service nova] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Received event network-vif-plugged-d8e06d4a-934f-4b21-9e63-2f767deba066 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 560.973576] env[60175]: DEBUG oslo_concurrency.lockutils [req-e1395a3e-3dab-4ad1-98cc-1deca30fb30d req-8b1d6ded-5260-4eff-987b-07e7b22db6c7 service nova] Acquiring lock "a0755f79-7df4-4660-92e6-5dd80af94aaa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 560.973576] env[60175]: DEBUG oslo_concurrency.lockutils [req-e1395a3e-3dab-4ad1-98cc-1deca30fb30d req-8b1d6ded-5260-4eff-987b-07e7b22db6c7 service nova] Lock "a0755f79-7df4-4660-92e6-5dd80af94aaa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 560.973576] env[60175]: DEBUG oslo_concurrency.lockutils [req-e1395a3e-3dab-4ad1-98cc-1deca30fb30d req-8b1d6ded-5260-4eff-987b-07e7b22db6c7 service nova] Lock "a0755f79-7df4-4660-92e6-5dd80af94aaa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 560.973576] env[60175]: DEBUG nova.compute.manager [req-e1395a3e-3dab-4ad1-98cc-1deca30fb30d req-8b1d6ded-5260-4eff-987b-07e7b22db6c7 service nova] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] No waiting events found dispatching network-vif-plugged-d8e06d4a-934f-4b21-9e63-2f767deba066 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 560.973931] env[60175]: WARNING nova.compute.manager [req-e1395a3e-3dab-4ad1-98cc-1deca30fb30d req-8b1d6ded-5260-4eff-987b-07e7b22db6c7 service nova] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Received unexpected event network-vif-plugged-d8e06d4a-934f-4b21-9e63-2f767deba066 for instance with vm_state building and task_state spawning. 
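(Editorial sketch.) The per-instance "-events" lock triplet just above, and the "refresh_cache-<uuid>" acquire/release pairs throughout this log, come from oslo.concurrency's lockutils. The sketch below shows both forms of that API with illustrative lock names rather than real instance UUIDs; the refresh helper is hypothetical.

from oslo_concurrency import lockutils


# Decorator form: the function body runs only while the named lock is held.
# This is what logs 'Acquiring lock ... by ...', '... acquired by ... ::
# waited' and '... "released" by ... :: held' (the inner() lines above).
@lockutils.synchronized('example-instance-uuid-events')
def _pop_event():
    pass


# Context-manager form, which produces the plain Acquiring/Acquired/Releasing
# lines around the refresh_cache locks. Hypothetical helper for illustration.
def refresh_network_cache(instance_uuid):
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        pass  # rebuild the instance's network info cache under the lock

The "Received unexpected event" warning that closes the entry above indicates only that the network-vif-plugged notification arrived while no code path was registered to wait for it, which commonly happens while the instance is still in the building/spawning state.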
[ 561.070758] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 561.071187] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 561.071266] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 561.239385] env[60175]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Updating instance_info_cache with network_info: [{"id": "090c1147-249c-422b-914d-47a8a7fb841b", "address": "fa:16:3e:31:23:e4", "network": {"id": "11e6485a-ab1e-4734-a691-e5f89baa8688", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1693045867-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ce403fd7ca154238a6c92f219ddf95fc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "510d3c47-3615-43d5-aa5d-a279fd915e71", "external-id": "nsx-vlan-transportzone-436", "segmentation_id": 436, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap090c1147-24", "ovs_interfaceid": "090c1147-249c-422b-914d-47a8a7fb841b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 561.249068] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292846, 'name': CreateVM_Task} progress is 99%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 561.254920] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Releasing lock "refresh_cache-72caf1e5-e894-4581-a95d-21dda85e11b0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 561.255271] env[60175]: DEBUG nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Instance network_info: |[{"id": "090c1147-249c-422b-914d-47a8a7fb841b", "address": "fa:16:3e:31:23:e4", "network": {"id": "11e6485a-ab1e-4734-a691-e5f89baa8688", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1693045867-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ce403fd7ca154238a6c92f219ddf95fc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "510d3c47-3615-43d5-aa5d-a279fd915e71", "external-id": "nsx-vlan-transportzone-436", "segmentation_id": 436, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap090c1147-24", "ovs_interfaceid": "090c1147-249c-422b-914d-47a8a7fb841b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 561.255637] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:31:23:e4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '510d3c47-3615-43d5-aa5d-a279fd915e71', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '090c1147-249c-422b-914d-47a8a7fb841b', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 561.264256] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Creating folder: Project (ce403fd7ca154238a6c92f219ddf95fc). Parent ref: group-v845475. 
{{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 561.264936] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bc84e4da-a9ed-4a23-861a-3ce3f6b0812d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.276239] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Created folder: Project (ce403fd7ca154238a6c92f219ddf95fc) in parent group-v845475. [ 561.276422] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Creating folder: Instances. Parent ref: group-v845488. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 561.276662] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-52a7202d-b87a-4350-92ad-bf6634cbddbc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.286236] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Created folder: Instances in parent group-v845488. [ 561.286498] env[60175]: DEBUG oslo.service.loopingcall [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 561.286695] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 561.286899] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-31f2d3bd-f5f6-4c37-ac23-e93ca827460e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.310342] env[60175]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Updating instance_info_cache with network_info: [{"id": "8dcb7ce7-6d69-476a-a3c3-9988b72d22ec", "address": "fa:16:3e:93:53:ef", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8dcb7ce7-6d", "ovs_interfaceid": 
"8dcb7ce7-6d69-476a-a3c3-9988b72d22ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 561.320307] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 561.320307] env[60175]: value = "task-4292849" [ 561.320307] env[60175]: _type = "Task" [ 561.320307] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 561.329726] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292849, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 561.332394] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Releasing lock "refresh_cache-3107f9c0-9a35-424c-9fa3-d60057b9ceec" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 561.333038] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Instance network_info: |[{"id": "8dcb7ce7-6d69-476a-a3c3-9988b72d22ec", "address": "fa:16:3e:93:53:ef", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8dcb7ce7-6d", "ovs_interfaceid": "8dcb7ce7-6d69-476a-a3c3-9988b72d22ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 561.333219] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:93:53:ef', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '537e0890-4fa2-4f2d-b74c-49933a4edf53', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8dcb7ce7-6d69-476a-a3c3-9988b72d22ec', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 561.340812] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 
tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Creating folder: Project (a2116357188240649088d46f3a6c3c50). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 561.341666] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-85f02c22-3940-48de-9d55-c517ef6d188b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.353624] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Created folder: Project (a2116357188240649088d46f3a6c3c50) in parent group-v845475. [ 561.354432] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Creating folder: Instances. Parent ref: group-v845491. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 561.354432] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c87989d1-1080-4755-9c5f-60e34eaf69eb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.364760] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Created folder: Instances in parent group-v845491. [ 561.366140] env[60175]: DEBUG oslo.service.loopingcall [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 561.366368] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 561.366624] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bf9fbe44-c3df-4a63-8f40-44003b6246de {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.390248] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 561.390248] env[60175]: value = "task-4292852" [ 561.390248] env[60175]: _type = "Task" [ 561.390248] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 561.397922] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292852, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 561.571193] env[60175]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Successfully created port: cef8131b-f245-4c5c-b33c-c3ccffa404fb {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 561.599431] env[60175]: DEBUG nova.compute.manager [req-d8c36fb8-70d0-43d4-afbe-3a93f5de0127 req-b64fc7e9-eba0-4040-af57-83a9a42b4a0b service nova] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Received event network-vif-plugged-e74e7f64-1640-4948-8af7-86b8d9f2542e {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 561.599888] env[60175]: DEBUG oslo_concurrency.lockutils [req-d8c36fb8-70d0-43d4-afbe-3a93f5de0127 req-b64fc7e9-eba0-4040-af57-83a9a42b4a0b service nova] Acquiring lock "b5636e10-af08-49d3-a9b2-8122521a9e2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 561.600941] env[60175]: DEBUG oslo_concurrency.lockutils [req-d8c36fb8-70d0-43d4-afbe-3a93f5de0127 req-b64fc7e9-eba0-4040-af57-83a9a42b4a0b service nova] Lock "b5636e10-af08-49d3-a9b2-8122521a9e2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 561.600941] env[60175]: DEBUG oslo_concurrency.lockutils [req-d8c36fb8-70d0-43d4-afbe-3a93f5de0127 req-b64fc7e9-eba0-4040-af57-83a9a42b4a0b service nova] Lock "b5636e10-af08-49d3-a9b2-8122521a9e2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 561.600941] env[60175]: DEBUG nova.compute.manager [req-d8c36fb8-70d0-43d4-afbe-3a93f5de0127 req-b64fc7e9-eba0-4040-af57-83a9a42b4a0b service nova] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] No waiting events found dispatching network-vif-plugged-e74e7f64-1640-4948-8af7-86b8d9f2542e {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 561.600941] env[60175]: WARNING nova.compute.manager [req-d8c36fb8-70d0-43d4-afbe-3a93f5de0127 req-b64fc7e9-eba0-4040-af57-83a9a42b4a0b service nova] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Received unexpected event network-vif-plugged-e74e7f64-1640-4948-8af7-86b8d9f2542e for instance with vm_state building and task_state spawning. [ 561.746929] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292846, 'name': CreateVM_Task, 'duration_secs': 1.353258} completed successfully. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 561.748235] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 561.749119] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 561.749290] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 561.749615] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 561.749872] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b99c7125-5bdf-4d36-812e-4b7c3c325070 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.757070] env[60175]: DEBUG oslo_vmware.api [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Waiting for the task: (returnval){ [ 561.757070] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52d9daa4-2fb2-b949-a406-7ff92e32c6a6" [ 561.757070] env[60175]: _type = "Task" [ 561.757070] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 561.769194] env[60175]: DEBUG oslo_vmware.api [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52d9daa4-2fb2-b949-a406-7ff92e32c6a6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 561.835730] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292849, 'name': CreateVM_Task, 'duration_secs': 0.454086} completed successfully. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 561.835958] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 561.836772] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 561.900847] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292852, 'name': CreateVM_Task, 'duration_secs': 0.434206} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 561.902368] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 561.902368] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 562.277851] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 562.278162] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 562.278371] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 562.278577] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 562.278922] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) 
lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 562.279325] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c0f5be59-847e-4f12-ab43-3c2e5793b23f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.287170] env[60175]: DEBUG oslo_vmware.api [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Waiting for the task: (returnval){ [ 562.287170] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52a427d8-8079-faf9-2477-3fac078b6c3b" [ 562.287170] env[60175]: _type = "Task" [ 562.287170] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 562.300431] env[60175]: DEBUG oslo_vmware.api [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52a427d8-8079-faf9-2477-3fac078b6c3b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 562.509827] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "da3eaeea-ce26-40eb-af8b-8857f927e431" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.510311] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "da3eaeea-ce26-40eb-af8b-8857f927e431" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.522970] env[60175]: DEBUG nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 562.604032] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.604032] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.604032] env[60175]: INFO nova.compute.claims [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 562.802496] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 562.802496] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 562.802496] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 562.802664] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 562.802953] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 562.803212] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f916c262-eab6-4311-8275-6146e9fc91fb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.812356] env[60175]: DEBUG oslo_vmware.api [None 
req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Waiting for the task: (returnval){ [ 562.812356] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]526580c9-dd5d-ae17-41ba-9232f4a4a593" [ 562.812356] env[60175]: _type = "Task" [ 562.812356] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 562.831025] env[60175]: DEBUG oslo_vmware.api [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]526580c9-dd5d-ae17-41ba-9232f4a4a593, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 562.958675] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.959134] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.959367] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 562.959519] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 562.970045] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-175196fc-d2d0-4e82-bc04-cc116a77c407 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.982140] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a038ce3d-b0aa-49cf-a94e-62ca9f1d4716 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.987660] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.987762] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.987910] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Skipping network cache update for instance because it is Building. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.988452] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.988452] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.988452] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.988963] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.988963] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 562.989057] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 562.989874] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.990750] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.990750] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.990750] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.990750] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.990938] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 562.991137] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 562.991280] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 563.020376] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.021137] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb84eaaa-6858-4a13-81f9-b62b2018097f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.029203] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d19b4148-a1e1-4189-a155-4b864dff15e8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.044767] env[60175]: DEBUG nova.compute.provider_tree [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 563.052860] env[60175]: DEBUG nova.scheduler.client.report [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 563.065891] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.464s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.066313] env[60175]: DEBUG nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Start building networks asynchronously for instance. 
{{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 563.069350] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.049s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.069350] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.069350] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 563.070294] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50d71c67-50a6-43de-aa00-625d8f0a4650 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.078121] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbbc54a7-7f75-44a6-a47f-9dbf670a587d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.094853] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fbcd34b-9f2b-473b-b7a7-8cbf934e707e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.098770] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c100c1ab-d3a6-4a46-a965-db6ddfc02a71 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.104972] env[60175]: DEBUG nova.compute.utils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 563.131818] env[60175]: DEBUG nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Allocating IP information in the background. 
{{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 563.131818] env[60175]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 563.133455] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180691MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 563.133599] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.133782] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.135766] env[60175]: DEBUG nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 563.203653] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance dedec08e-95d1-4467-96a4-cdec5f170e01 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 563.203745] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance a0755f79-7df4-4660-92e6-5dd80af94aaa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 563.203792] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance b5636e10-af08-49d3-a9b2-8122521a9e2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 563.203918] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 99d97004-9f23-48ee-a88b-75fdb6acc4b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 563.204066] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 563.204192] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 72caf1e5-e894-4581-a95d-21dda85e11b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 563.204341] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 563.204419] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance da3eaeea-ce26-40eb-af8b-8857f927e431 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 563.204774] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 563.204774] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=149GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 563.239647] env[60175]: DEBUG nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Start spawning the instance on the hypervisor. 
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 563.271304] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 563.271681] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 563.271904] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 563.272154] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 563.272342] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 563.272521] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 563.272763] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 563.272957] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 563.273170] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] 
Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 563.273500] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 563.273729] env[60175]: DEBUG nova.virt.hardware [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 563.274694] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e57cbeb9-f44f-4131-8a4e-c1cac95c80dc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.286732] env[60175]: DEBUG nova.policy [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3736edde826e4b4cabc61b17c223ace6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c984e18184364cb6a11bd2014bc377b3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 563.293447] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a244ccd-47e5-421f-9ded-8ebaa5e8e755 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.326094] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 563.326628] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 563.326926] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 563.356726] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-569bb30b-e00b-4735-a426-250459fbf17f {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.363898] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fbf5447-2e67-438b-ab46-00c8eaf54a6c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.400326] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc419908-20e4-43e9-a62b-8623b4818780 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.408291] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ced0973e-96e0-4a1b-b221-3c2781724ca1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.427331] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 563.439155] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 563.457420] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 563.457621] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.324s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.650606] env[60175]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Successfully updated port: cef8131b-f245-4c5c-b33c-c3ccffa404fb {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 563.670382] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "refresh_cache-7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 563.670540] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquired lock "refresh_cache-7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" {{(pid=60175) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 563.670693] env[60175]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 563.731030] env[60175]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 563.991347] env[60175]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Updating instance_info_cache with network_info: [{"id": "cef8131b-f245-4c5c-b33c-c3ccffa404fb", "address": "fa:16:3e:ac:ac:47", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcef8131b-f2", "ovs_interfaceid": "cef8131b-f245-4c5c-b33c-c3ccffa404fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 563.997580] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "8bc7299c-35d4-4e9f-a243-2834fbadd987" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.997724] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "8bc7299c-35d4-4e9f-a243-2834fbadd987" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 564.006361] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Releasing lock "refresh_cache-7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" {{(pid=60175) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 564.006998] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Instance network_info: |[{"id": "cef8131b-f245-4c5c-b33c-c3ccffa404fb", "address": "fa:16:3e:ac:ac:47", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcef8131b-f2", "ovs_interfaceid": "cef8131b-f245-4c5c-b33c-c3ccffa404fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 564.007125] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ac:ac:47', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '537e0890-4fa2-4f2d-b74c-49933a4edf53', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cef8131b-f245-4c5c-b33c-c3ccffa404fb', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 564.015590] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Creating folder: Project (ed721a0a42ee43fba6f37868594bffec). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 564.016788] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f7fb4a94-4726-4e66-8fdd-0fc7ca10a61c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.019097] env[60175]: DEBUG nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 564.034855] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Created folder: Project (ed721a0a42ee43fba6f37868594bffec) in parent group-v845475. 
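
The Folder.CreateFolder and Folder.CreateVM_Task invocations recorded around this point, together with the repeated "progress is N%" / "completed successfully" lines, follow oslo.vmware's invoke-and-wait pattern. The sketch below is illustrative only: the connection details, config spec, and managed object references are placeholders, not values taken from this log.

from oslo_vmware import api as vmware_api

# Placeholder connection details -- not the vCenter used in this log.
session = vmware_api.VMwareAPISession(
    'vc.example.test',   # vCenter host
    'user', 'secret',    # credentials
    10,                  # api_retry_count
    0.5)                 # task_poll_interval, in seconds

def create_instance_folder(parent_folder_ref, name):
    # CreateFolder is synchronous and returns the new Folder reference
    # directly (the "Creating folder: ..." / "Created folder: ..." records).
    return session.invoke_api(session.vim, 'CreateFolder',
                              parent_folder_ref, name=name)

def create_vm(vm_folder_ref, config_spec, res_pool_ref, host_ref=None):
    # CreateVM_Task returns a Task managed object reference; wait_for_task()
    # polls it (the "progress is N%" lines) and returns the TaskInfo once
    # vCenter reports completion, raising on a vCenter fault.
    task_ref = session.invoke_api(session.vim, 'CreateVM_Task', vm_folder_ref,
                                  config=config_spec, pool=res_pool_ref,
                                  host=host_ref)
    task_info = session.wait_for_task(task_ref)
    return task_info.result  # managed object reference of the new VM
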
[ 564.035122] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Creating folder: Instances. Parent ref: group-v845494. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 564.035404] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b300d067-bf72-4591-b416-6dbfa11c2608 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.049688] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Created folder: Instances in parent group-v845494. [ 564.049688] env[60175]: DEBUG oslo.service.loopingcall [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 564.049688] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 564.049838] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e3a3888d-5c9e-4335-8f01-2ac2d2a550e6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.074175] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 564.074175] env[60175]: value = "task-4292855" [ 564.074175] env[60175]: _type = "Task" [ 564.074175] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 564.088603] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292855, 'name': CreateVM_Task} progress is 5%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 564.092039] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 564.092039] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 564.092818] env[60175]: INFO nova.compute.claims [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 564.349706] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d67a815-c1d8-4c53-8cf8-61ca3e6f81c6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.361223] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c1ae17d-cfe9-497a-b744-9c19adb56c7e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.404235] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e8bcf1a-dc95-42ff-9410-f91722798c67 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.413590] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9939bc65-fa80-437d-9879-0841d54f48e4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.428781] env[60175]: DEBUG nova.compute.provider_tree [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 564.443268] env[60175]: DEBUG nova.scheduler.client.report [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 564.461219] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d 
tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.369s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 564.461219] env[60175]: DEBUG nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 564.508805] env[60175]: DEBUG nova.compute.utils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 564.508936] env[60175]: DEBUG nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 564.509121] env[60175]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 564.517803] env[60175]: DEBUG nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 564.591555] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292855, 'name': CreateVM_Task, 'duration_secs': 0.314105} completed successfully. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 564.591883] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 564.592794] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 564.593078] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 564.593679] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 564.594314] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1742a795-2ade-4fe3-a528-c8d15810e388 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.599190] env[60175]: DEBUG oslo_vmware.api [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Waiting for the task: (returnval){ [ 564.599190] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52b7b54e-b80c-6a7b-7acc-6f3add9c1e29" [ 564.599190] env[60175]: _type = "Task" [ 564.599190] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 564.608193] env[60175]: DEBUG oslo_vmware.api [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52b7b54e-b80c-6a7b-7acc-6f3add9c1e29, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 564.615966] env[60175]: DEBUG nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Start spawning the instance on the hypervisor. 
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 564.637376] env[60175]: DEBUG nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 564.637599] env[60175]: DEBUG nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 564.637599] env[60175]: DEBUG nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 564.638183] env[60175]: DEBUG nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 564.638398] env[60175]: DEBUG nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 564.638582] env[60175]: DEBUG nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 564.638837] env[60175]: DEBUG nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 564.639040] env[60175]: DEBUG nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 564.639251] env[60175]: DEBUG 
nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 564.639451] env[60175]: DEBUG nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 564.639747] env[60175]: DEBUG nova.virt.hardware [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 564.640554] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8d378c8-67a5-410b-9ac7-713a9074272a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.648778] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4b5b402-fcc2-485e-818d-1595366b2ad8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.655035] env[60175]: DEBUG nova.policy [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '36d95e8e77d74f019725115e00d59093', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0f9d7fe46ca145d3983ce03907f5842c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 565.034596] env[60175]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Successfully created port: 2b078cf4-954e-4719-8e2e-3569c1ecf656 {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 565.108705] env[60175]: DEBUG nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Received event network-vif-plugged-f3d03414-4f14-4dc8-b894-fed370ba6abe {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 565.109029] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Acquiring lock "99d97004-9f23-48ee-a88b-75fdb6acc4b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 565.109276] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Lock 
"99d97004-9f23-48ee-a88b-75fdb6acc4b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 565.109455] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Lock "99d97004-9f23-48ee-a88b-75fdb6acc4b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 565.109622] env[60175]: DEBUG nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] No waiting events found dispatching network-vif-plugged-f3d03414-4f14-4dc8-b894-fed370ba6abe {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 565.109784] env[60175]: WARNING nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Received unexpected event network-vif-plugged-f3d03414-4f14-4dc8-b894-fed370ba6abe for instance with vm_state building and task_state spawning. [ 565.109958] env[60175]: DEBUG nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Received event network-changed-d8e06d4a-934f-4b21-9e63-2f767deba066 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 565.110134] env[60175]: DEBUG nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Refreshing instance network info cache due to event network-changed-d8e06d4a-934f-4b21-9e63-2f767deba066. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 565.110319] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Acquiring lock "refresh_cache-a0755f79-7df4-4660-92e6-5dd80af94aaa" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 565.110456] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Acquired lock "refresh_cache-a0755f79-7df4-4660-92e6-5dd80af94aaa" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 565.110610] env[60175]: DEBUG nova.network.neutron [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Refreshing network info cache for port d8e06d4a-934f-4b21-9e63-2f767deba066 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 565.115673] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 565.115958] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 565.116114] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 565.529683] env[60175]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Successfully created port: 1df8b4df-6a45-4701-872a-25d8bdbb26c7 {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 565.576709] env[60175]: DEBUG nova.compute.manager [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Received event network-changed-e74e7f64-1640-4948-8af7-86b8d9f2542e {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 565.577835] env[60175]: DEBUG nova.compute.manager [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Refreshing instance network info cache due to event network-changed-e74e7f64-1640-4948-8af7-86b8d9f2542e. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 565.577835] env[60175]: DEBUG oslo_concurrency.lockutils [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] Acquiring lock "refresh_cache-b5636e10-af08-49d3-a9b2-8122521a9e2c" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 565.577835] env[60175]: DEBUG oslo_concurrency.lockutils [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] Acquired lock "refresh_cache-b5636e10-af08-49d3-a9b2-8122521a9e2c" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 565.577835] env[60175]: DEBUG nova.network.neutron [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Refreshing network info cache for port e74e7f64-1640-4948-8af7-86b8d9f2542e {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 566.243554] env[60175]: DEBUG nova.network.neutron [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Updated VIF entry in instance network info cache for port e74e7f64-1640-4948-8af7-86b8d9f2542e. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 566.243554] env[60175]: DEBUG nova.network.neutron [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Updating instance_info_cache with network_info: [{"id": "e74e7f64-1640-4948-8af7-86b8d9f2542e", "address": "fa:16:3e:05:55:d2", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.180", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape74e7f64-16", "ovs_interfaceid": "e74e7f64-1640-4948-8af7-86b8d9f2542e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 566.252432] env[60175]: DEBUG oslo_concurrency.lockutils [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] Releasing lock "refresh_cache-b5636e10-af08-49d3-a9b2-8122521a9e2c" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 566.254195] env[60175]: DEBUG nova.compute.manager [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Received event network-vif-plugged-8dcb7ce7-6d69-476a-a3c3-9988b72d22ec {{(pid=60175) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 566.254195] env[60175]: DEBUG oslo_concurrency.lockutils [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] Acquiring lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 566.254195] env[60175]: DEBUG oslo_concurrency.lockutils [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] Lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 566.254195] env[60175]: DEBUG oslo_concurrency.lockutils [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] Lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 566.254627] env[60175]: DEBUG nova.compute.manager [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] No waiting events found dispatching network-vif-plugged-8dcb7ce7-6d69-476a-a3c3-9988b72d22ec {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 566.256023] env[60175]: WARNING nova.compute.manager [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Received unexpected event network-vif-plugged-8dcb7ce7-6d69-476a-a3c3-9988b72d22ec for instance with vm_state building and task_state spawning. [ 566.256023] env[60175]: DEBUG nova.compute.manager [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Received event network-changed-8dcb7ce7-6d69-476a-a3c3-9988b72d22ec {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 566.256023] env[60175]: DEBUG nova.compute.manager [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Refreshing instance network info cache due to event network-changed-8dcb7ce7-6d69-476a-a3c3-9988b72d22ec. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 566.256023] env[60175]: DEBUG oslo_concurrency.lockutils [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] Acquiring lock "refresh_cache-3107f9c0-9a35-424c-9fa3-d60057b9ceec" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 566.256023] env[60175]: DEBUG oslo_concurrency.lockutils [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] Acquired lock "refresh_cache-3107f9c0-9a35-424c-9fa3-d60057b9ceec" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 566.256588] env[60175]: DEBUG nova.network.neutron [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Refreshing network info cache for port 8dcb7ce7-6d69-476a-a3c3-9988b72d22ec {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 566.606331] env[60175]: DEBUG nova.network.neutron [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Updated VIF entry in instance network info cache for port d8e06d4a-934f-4b21-9e63-2f767deba066. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 566.607057] env[60175]: DEBUG nova.network.neutron [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Updating instance_info_cache with network_info: [{"id": "d8e06d4a-934f-4b21-9e63-2f767deba066", "address": "fa:16:3e:63:be:47", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.168", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8e06d4a-93", "ovs_interfaceid": "d8e06d4a-934f-4b21-9e63-2f767deba066", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 566.618484] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Releasing lock "refresh_cache-a0755f79-7df4-4660-92e6-5dd80af94aaa" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 566.619051] env[60175]: DEBUG nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Received event network-changed-f3d03414-4f14-4dc8-b894-fed370ba6abe {{(pid=60175) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 566.619364] env[60175]: DEBUG nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Refreshing instance network info cache due to event network-changed-f3d03414-4f14-4dc8-b894-fed370ba6abe. {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 566.619698] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Acquiring lock "refresh_cache-99d97004-9f23-48ee-a88b-75fdb6acc4b8" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 566.619948] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Acquired lock "refresh_cache-99d97004-9f23-48ee-a88b-75fdb6acc4b8" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 566.620448] env[60175]: DEBUG nova.network.neutron [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Refreshing network info cache for port f3d03414-4f14-4dc8-b894-fed370ba6abe {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 566.992830] env[60175]: DEBUG nova.network.neutron [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Updated VIF entry in instance network info cache for port 8dcb7ce7-6d69-476a-a3c3-9988b72d22ec. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 566.993187] env[60175]: DEBUG nova.network.neutron [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Updating instance_info_cache with network_info: [{"id": "8dcb7ce7-6d69-476a-a3c3-9988b72d22ec", "address": "fa:16:3e:93:53:ef", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.68", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8dcb7ce7-6d", "ovs_interfaceid": "8dcb7ce7-6d69-476a-a3c3-9988b72d22ec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 567.012150] env[60175]: DEBUG oslo_concurrency.lockutils [req-566416f0-4aa8-4500-9e29-c3b4d0c8b911 req-dd7cce81-3b81-43c7-ba47-db139ba875ba service nova] Releasing lock "refresh_cache-3107f9c0-9a35-424c-9fa3-d60057b9ceec" 
{{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 567.076260] env[60175]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Successfully updated port: 1df8b4df-6a45-4701-872a-25d8bdbb26c7 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 567.092419] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "refresh_cache-8bc7299c-35d4-4e9f-a243-2834fbadd987" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 567.093616] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquired lock "refresh_cache-8bc7299c-35d4-4e9f-a243-2834fbadd987" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 567.093616] env[60175]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 567.170074] env[60175]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 567.609195] env[60175]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Updating instance_info_cache with network_info: [{"id": "1df8b4df-6a45-4701-872a-25d8bdbb26c7", "address": "fa:16:3e:23:36:9c", "network": {"id": "23363b1d-446a-4887-9bbc-9e005797e022", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-529867696-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0f9d7fe46ca145d3983ce03907f5842c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a8c8175-1197-4f12-baac-ef6aba95f585", "external-id": "nsx-vlan-transportzone-832", "segmentation_id": 832, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1df8b4df-6a", "ovs_interfaceid": "1df8b4df-6a45-4701-872a-25d8bdbb26c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 567.628504] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Releasing lock "refresh_cache-8bc7299c-35d4-4e9f-a243-2834fbadd987" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 567.628817] env[60175]: DEBUG nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Instance network_info: |[{"id": "1df8b4df-6a45-4701-872a-25d8bdbb26c7", "address": "fa:16:3e:23:36:9c", "network": {"id": "23363b1d-446a-4887-9bbc-9e005797e022", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-529867696-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0f9d7fe46ca145d3983ce03907f5842c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a8c8175-1197-4f12-baac-ef6aba95f585", "external-id": "nsx-vlan-transportzone-832", "segmentation_id": 832, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1df8b4df-6a", "ovs_interfaceid": "1df8b4df-6a45-4701-872a-25d8bdbb26c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 567.629141] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:23:36:9c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1a8c8175-1197-4f12-baac-ef6aba95f585', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1df8b4df-6a45-4701-872a-25d8bdbb26c7', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 567.641415] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Creating folder: Project (0f9d7fe46ca145d3983ce03907f5842c). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 567.642694] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9c2f6c2a-cc52-49c9-adc8-d5c1d43d8f36 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.653077] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Created folder: Project (0f9d7fe46ca145d3983ce03907f5842c) in parent group-v845475. [ 567.653314] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Creating folder: Instances. Parent ref: group-v845497. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 567.654129] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-083262f3-cd1e-4aa0-8fca-a76d9a2145e7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.665254] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Created folder: Instances in parent group-v845497. [ 567.665594] env[60175]: DEBUG oslo.service.loopingcall [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 567.665842] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 567.666057] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-556f3523-6809-470f-9455-c40b0647c79b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.687118] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 567.687118] env[60175]: value = "task-4292858" [ 567.687118] env[60175]: _type = "Task" [ 567.687118] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 567.695700] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292858, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 567.734211] env[60175]: DEBUG nova.network.neutron [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Updated VIF entry in instance network info cache for port f3d03414-4f14-4dc8-b894-fed370ba6abe. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 567.734211] env[60175]: DEBUG nova.network.neutron [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Updating instance_info_cache with network_info: [{"id": "f3d03414-4f14-4dc8-b894-fed370ba6abe", "address": "fa:16:3e:e1:fb:eb", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.162", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf3d03414-4f", "ovs_interfaceid": "f3d03414-4f14-4dc8-b894-fed370ba6abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 567.748192] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Releasing lock "refresh_cache-99d97004-9f23-48ee-a88b-75fdb6acc4b8" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 567.748459] env[60175]: DEBUG nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Received event network-vif-plugged-090c1147-249c-422b-914d-47a8a7fb841b {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 567.749746] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Acquiring lock "72caf1e5-e894-4581-a95d-21dda85e11b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 567.749746] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Lock "72caf1e5-e894-4581-a95d-21dda85e11b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 567.749746] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Lock "72caf1e5-e894-4581-a95d-21dda85e11b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 567.749746] env[60175]: DEBUG nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] No waiting events found dispatching network-vif-plugged-090c1147-249c-422b-914d-47a8a7fb841b {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 567.750014] env[60175]: WARNING nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Received unexpected event network-vif-plugged-090c1147-249c-422b-914d-47a8a7fb841b for instance with vm_state building and task_state spawning. [ 567.750014] env[60175]: DEBUG nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Received event network-changed-090c1147-249c-422b-914d-47a8a7fb841b {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 567.750014] env[60175]: DEBUG nova.compute.manager [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Refreshing instance network info cache due to event network-changed-090c1147-249c-422b-914d-47a8a7fb841b. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 567.754261] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Acquiring lock "refresh_cache-72caf1e5-e894-4581-a95d-21dda85e11b0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 567.756402] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Acquired lock "refresh_cache-72caf1e5-e894-4581-a95d-21dda85e11b0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 567.756402] env[60175]: DEBUG nova.network.neutron [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Refreshing network info cache for port 090c1147-249c-422b-914d-47a8a7fb841b {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 568.129287] env[60175]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Successfully updated port: 2b078cf4-954e-4719-8e2e-3569c1ecf656 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 568.140253] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "refresh_cache-da3eaeea-ce26-40eb-af8b-8857f927e431" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 568.140253] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquired lock "refresh_cache-da3eaeea-ce26-40eb-af8b-8857f927e431" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 568.140253] env[60175]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 568.199227] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292858, 'name': CreateVM_Task, 'duration_secs': 0.295051} completed successfully. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 568.199416] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 568.200090] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 568.200256] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 568.200550] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 568.200799] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c3d6f00b-1a83-41c5-9cc4-c2b2913f1f18 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 568.208029] env[60175]: DEBUG oslo_vmware.api [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Waiting for the task: (returnval){ [ 568.208029] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52f4f5d7-0538-9cd9-7582-ebe344ccec80" [ 568.208029] env[60175]: _type = "Task" [ 568.208029] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 568.219387] env[60175]: DEBUG oslo_vmware.api [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52f4f5d7-0538-9cd9-7582-ebe344ccec80, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 568.321232] env[60175]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 568.721311] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 568.722034] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 568.722455] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 569.022418] env[60175]: DEBUG nova.network.neutron [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Updated VIF entry in instance network info cache for port 090c1147-249c-422b-914d-47a8a7fb841b. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 569.022945] env[60175]: DEBUG nova.network.neutron [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Updating instance_info_cache with network_info: [{"id": "090c1147-249c-422b-914d-47a8a7fb841b", "address": "fa:16:3e:31:23:e4", "network": {"id": "11e6485a-ab1e-4734-a691-e5f89baa8688", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1693045867-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ce403fd7ca154238a6c92f219ddf95fc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "510d3c47-3615-43d5-aa5d-a279fd915e71", "external-id": "nsx-vlan-transportzone-436", "segmentation_id": 436, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap090c1147-24", "ovs_interfaceid": "090c1147-249c-422b-914d-47a8a7fb841b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 569.037923] env[60175]: DEBUG oslo_concurrency.lockutils [req-050acf22-08c6-422c-8e5b-7916d0dcdb5f req-3af0237d-93fd-4126-9632-049f01ff0d62 service nova] Releasing lock "refresh_cache-72caf1e5-e894-4581-a95d-21dda85e11b0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 
569.170646] env[60175]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Updating instance_info_cache with network_info: [{"id": "2b078cf4-954e-4719-8e2e-3569c1ecf656", "address": "fa:16:3e:16:91:e9", "network": {"id": "40c36353-2cdb-4f60-a41c-ff342ffb24c4", "bridge": "br-int", "label": "tempest-ServersTestJSON-861103854-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c984e18184364cb6a11bd2014bc377b3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f493cd8-1cb4-42a1-8d56-bfa6ac7cf563", "external-id": "nsx-vlan-transportzone-931", "segmentation_id": 931, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b078cf4-95", "ovs_interfaceid": "2b078cf4-954e-4719-8e2e-3569c1ecf656", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 569.185039] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Releasing lock "refresh_cache-da3eaeea-ce26-40eb-af8b-8857f927e431" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 569.185400] env[60175]: DEBUG nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Instance network_info: |[{"id": "2b078cf4-954e-4719-8e2e-3569c1ecf656", "address": "fa:16:3e:16:91:e9", "network": {"id": "40c36353-2cdb-4f60-a41c-ff342ffb24c4", "bridge": "br-int", "label": "tempest-ServersTestJSON-861103854-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c984e18184364cb6a11bd2014bc377b3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f493cd8-1cb4-42a1-8d56-bfa6ac7cf563", "external-id": "nsx-vlan-transportzone-931", "segmentation_id": 931, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b078cf4-95", "ovs_interfaceid": "2b078cf4-954e-4719-8e2e-3569c1ecf656", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 569.185715] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b 
tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:16:91:e9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6f493cd8-1cb4-42a1-8d56-bfa6ac7cf563', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2b078cf4-954e-4719-8e2e-3569c1ecf656', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 569.196910] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Creating folder: Project (c984e18184364cb6a11bd2014bc377b3). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 569.197614] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ccd8efde-b1f6-4c21-b8ce-26b818698e3b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.214520] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Created folder: Project (c984e18184364cb6a11bd2014bc377b3) in parent group-v845475. [ 569.215549] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Creating folder: Instances. Parent ref: group-v845500. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 569.215549] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-aec00b71-52ae-434a-aad1-77df457cd997 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.230856] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Created folder: Instances in parent group-v845500. [ 569.231132] env[60175]: DEBUG oslo.service.loopingcall [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 569.231335] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 569.232166] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ff2f20e2-329a-4ace-9e61-02b669085280 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.256770] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 569.256770] env[60175]: value = "task-4292861" [ 569.256770] env[60175]: _type = "Task" [ 569.256770] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 569.266639] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292861, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 569.767918] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292861, 'name': CreateVM_Task, 'duration_secs': 0.288904} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 569.770381] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 569.770381] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 569.770381] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 569.770381] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 569.770791] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0974ff0c-455f-454d-a86e-8865cd19fca0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.775414] env[60175]: DEBUG oslo_vmware.api [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Waiting for the task: (returnval){ [ 569.775414] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e486c9-afdb-babd-2009-87d7a672bb09" [ 569.775414] env[60175]: _type = "Task" [ 569.775414] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 569.785192] env[60175]: DEBUG oslo_vmware.api [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e486c9-afdb-babd-2009-87d7a672bb09, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 570.290021] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 570.290021] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 570.290021] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 570.738440] env[60175]: DEBUG nova.compute.manager [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Received event network-vif-plugged-cef8131b-f245-4c5c-b33c-c3ccffa404fb {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 570.738657] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Acquiring lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 570.738865] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 570.739259] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 570.739458] env[60175]: DEBUG nova.compute.manager [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] No waiting events found dispatching network-vif-plugged-cef8131b-f245-4c5c-b33c-c3ccffa404fb {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 570.739625] env[60175]: WARNING nova.compute.manager [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Received unexpected event network-vif-plugged-cef8131b-f245-4c5c-b33c-c3ccffa404fb for instance with vm_state building and task_state spawning. 
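(Illustrative aside, not part of the captured log.) The entries above show Nova's external-event dispatch at work: the compute manager takes the per-instance "<uuid>-events" lock, tries to pop a waiter for network-vif-plugged-<port>, and, finding none because the instance is still in vm_state building / task_state spawning, emits the "Received unexpected event" warning. Below is a minimal, self-contained Python sketch of that pattern; every name in it is a hypothetical stand-in for illustration, not Nova's actual classes or call signatures.

# Illustrative sketch only -- not Nova source. Models the pop_instance_event /
# "No waiting events found dispatching" behaviour visible in the log: a spawning
# thread may register a waiter for an event such as "network-vif-plugged-<port>",
# and the external-event handler either wakes that waiter or reports the event
# as unexpected.
import threading
from collections import defaultdict

class InstanceEvents:
    """Per-instance registry of event waiters (hypothetical stand-in)."""
    def __init__(self):
        self._lock = threading.Lock()        # plays the role of the "<uuid>-events" lock
        self._waiters = defaultdict(dict)    # instance_uuid -> {event_name: threading.Event}

    def prepare_for_event(self, instance_uuid, event_name):
        """Register a waiter before the event is expected (e.g. during spawn)."""
        waiter = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_name] = waiter
        return waiter

    def pop_instance_event(self, instance_uuid, event_name):
        """Remove and return the waiter for this event, or None if nobody is waiting."""
        with self._lock:
            return self._waiters.get(instance_uuid, {}).pop(event_name, None)

def external_instance_event(events, instance_uuid, event_name, vm_state, task_state):
    """Dispatch one externally reported event, mirroring the log entries above."""
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is not None:
        waiter.set()    # wake whoever was blocked on this VIF plug
    else:
        # Corresponds to the WARNING in the log: no waiter was registered yet,
        # e.g. because the instance is still building/spawning.
        print(f"Received unexpected event {event_name} for instance "
              f"with vm_state {vm_state} and task_state {task_state}")

if __name__ == "__main__":
    ev = InstanceEvents()
    external_instance_event(ev, "da3eaeea-ce26-40eb-af8b-8857f927e431",
                            "network-vif-plugged-2b078cf4-954e-4719-8e2e-3569c1ecf656",
                            "building", "spawning")

In the real service the waiting side typically blocks during spawn until Neutron reports the VIF as plugged or a timeout expires, which is why events arriving before a waiter has been registered surface only as warnings rather than errors.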
[ 570.739785] env[60175]: DEBUG nova.compute.manager [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Received event network-changed-cef8131b-f245-4c5c-b33c-c3ccffa404fb {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 570.739936] env[60175]: DEBUG nova.compute.manager [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Refreshing instance network info cache due to event network-changed-cef8131b-f245-4c5c-b33c-c3ccffa404fb. {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 570.740139] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Acquiring lock "refresh_cache-7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 570.740276] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Acquired lock "refresh_cache-7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 570.743450] env[60175]: DEBUG nova.network.neutron [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Refreshing network info cache for port cef8131b-f245-4c5c-b33c-c3ccffa404fb {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 570.769769] env[60175]: DEBUG nova.compute.manager [req-c042a98f-6aa9-498b-9a9d-2a6238c7fa0c req-9275ed84-02b9-4839-821c-39767e0c35a4 service nova] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Received event network-vif-plugged-2b078cf4-954e-4719-8e2e-3569c1ecf656 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 570.769769] env[60175]: DEBUG oslo_concurrency.lockutils [req-c042a98f-6aa9-498b-9a9d-2a6238c7fa0c req-9275ed84-02b9-4839-821c-39767e0c35a4 service nova] Acquiring lock "da3eaeea-ce26-40eb-af8b-8857f927e431-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 570.769769] env[60175]: DEBUG oslo_concurrency.lockutils [req-c042a98f-6aa9-498b-9a9d-2a6238c7fa0c req-9275ed84-02b9-4839-821c-39767e0c35a4 service nova] Lock "da3eaeea-ce26-40eb-af8b-8857f927e431-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 570.769769] env[60175]: DEBUG oslo_concurrency.lockutils [req-c042a98f-6aa9-498b-9a9d-2a6238c7fa0c req-9275ed84-02b9-4839-821c-39767e0c35a4 service nova] Lock "da3eaeea-ce26-40eb-af8b-8857f927e431-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 570.770420] env[60175]: DEBUG nova.compute.manager [req-c042a98f-6aa9-498b-9a9d-2a6238c7fa0c req-9275ed84-02b9-4839-821c-39767e0c35a4 service nova] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] No waiting events found dispatching 
network-vif-plugged-2b078cf4-954e-4719-8e2e-3569c1ecf656 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 570.770420] env[60175]: WARNING nova.compute.manager [req-c042a98f-6aa9-498b-9a9d-2a6238c7fa0c req-9275ed84-02b9-4839-821c-39767e0c35a4 service nova] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Received unexpected event network-vif-plugged-2b078cf4-954e-4719-8e2e-3569c1ecf656 for instance with vm_state building and task_state spawning. [ 571.138078] env[60175]: DEBUG nova.network.neutron [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Updated VIF entry in instance network info cache for port cef8131b-f245-4c5c-b33c-c3ccffa404fb. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 571.138324] env[60175]: DEBUG nova.network.neutron [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Updating instance_info_cache with network_info: [{"id": "cef8131b-f245-4c5c-b33c-c3ccffa404fb", "address": "fa:16:3e:ac:ac:47", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcef8131b-f2", "ovs_interfaceid": "cef8131b-f245-4c5c-b33c-c3ccffa404fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 571.155674] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Releasing lock "refresh_cache-7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 571.155808] env[60175]: DEBUG nova.compute.manager [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Received event network-vif-plugged-1df8b4df-6a45-4701-872a-25d8bdbb26c7 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 571.156042] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Acquiring lock "8bc7299c-35d4-4e9f-a243-2834fbadd987-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 571.156246] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service 
nova] Lock "8bc7299c-35d4-4e9f-a243-2834fbadd987-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 571.156406] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Lock "8bc7299c-35d4-4e9f-a243-2834fbadd987-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 571.156568] env[60175]: DEBUG nova.compute.manager [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] No waiting events found dispatching network-vif-plugged-1df8b4df-6a45-4701-872a-25d8bdbb26c7 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 571.157544] env[60175]: WARNING nova.compute.manager [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Received unexpected event network-vif-plugged-1df8b4df-6a45-4701-872a-25d8bdbb26c7 for instance with vm_state building and task_state spawning. [ 571.157544] env[60175]: DEBUG nova.compute.manager [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Received event network-changed-1df8b4df-6a45-4701-872a-25d8bdbb26c7 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 571.157544] env[60175]: DEBUG nova.compute.manager [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Refreshing instance network info cache due to event network-changed-1df8b4df-6a45-4701-872a-25d8bdbb26c7. {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 571.157544] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Acquiring lock "refresh_cache-8bc7299c-35d4-4e9f-a243-2834fbadd987" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 571.157544] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Acquired lock "refresh_cache-8bc7299c-35d4-4e9f-a243-2834fbadd987" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 571.157765] env[60175]: DEBUG nova.network.neutron [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Refreshing network info cache for port 1df8b4df-6a45-4701-872a-25d8bdbb26c7 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 571.814575] env[60175]: DEBUG nova.network.neutron [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Updated VIF entry in instance network info cache for port 1df8b4df-6a45-4701-872a-25d8bdbb26c7. 
{{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 571.814922] env[60175]: DEBUG nova.network.neutron [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Updating instance_info_cache with network_info: [{"id": "1df8b4df-6a45-4701-872a-25d8bdbb26c7", "address": "fa:16:3e:23:36:9c", "network": {"id": "23363b1d-446a-4887-9bbc-9e005797e022", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-529867696-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0f9d7fe46ca145d3983ce03907f5842c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a8c8175-1197-4f12-baac-ef6aba95f585", "external-id": "nsx-vlan-transportzone-832", "segmentation_id": 832, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1df8b4df-6a", "ovs_interfaceid": "1df8b4df-6a45-4701-872a-25d8bdbb26c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 571.824011] env[60175]: DEBUG oslo_concurrency.lockutils [req-d7762327-c46c-4232-bda1-53536be89099 req-6a9c85b1-b452-4f67-868d-9c8fb9b519ec service nova] Releasing lock "refresh_cache-8bc7299c-35d4-4e9f-a243-2834fbadd987" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 572.156349] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "81af879b-3bc3-4aff-a99d-98d3aba73512" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.156556] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "81af879b-3bc3-4aff-a99d-98d3aba73512" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.169912] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 572.234376] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.235219] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.236349] env[60175]: INFO nova.compute.claims [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 572.310429] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "843d4db6-c1fb-4b74-ad3c-779e309a170e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.310730] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "843d4db6-c1fb-4b74-ad3c-779e309a170e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.436297] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff6b79e2-7ded-4802-b469-c718f8d3caa4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.445127] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18ef16ec-b5ff-448c-b505-f698d322031c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.481466] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21bef6f2-0407-4b6b-b7c5-8dd268ef2672 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.489018] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d005bed0-acf2-4525-b757-fa616499a771 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.503887] env[60175]: DEBUG nova.compute.provider_tree [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Inventory has not changed in ProviderTree for provider: 
3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 572.514513] env[60175]: DEBUG nova.scheduler.client.report [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 572.531064] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.531565] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 572.569643] env[60175]: DEBUG nova.compute.utils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 572.571137] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 572.571304] env[60175]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 572.579967] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Start building block device mappings for instance. 
{{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 572.632923] env[60175]: DEBUG nova.policy [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd0925d391e2248eab9b7334e277d5d64', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b6a82227811a40cda939f7164f414da2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 572.656123] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Start spawning the instance on the hypervisor. {{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 572.683895] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 572.684208] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 572.684679] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 572.685178] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 572.685387] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 572.685559] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 572.685793] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 572.686065] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 572.686274] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 572.686523] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 572.686664] env[60175]: DEBUG nova.virt.hardware [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 572.687590] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-898d5966-24ba-4e0f-8a04-67827c9a411c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.696927] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b4622ec-14de-4577-8ade-98051cc66ed9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.116805] env[60175]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Successfully created port: 9f74c6a0-f8e0-4193-a3dd-5609a74460e0 {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 574.327116] env[60175]: DEBUG nova.compute.manager [req-9d361dd9-e2ed-4e54-befd-7fb638855cb7 req-f51b3b9e-9b28-482a-8572-06145ca5257e service nova] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Received event network-changed-2b078cf4-954e-4719-8e2e-3569c1ecf656 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 574.328261] env[60175]: DEBUG 
nova.compute.manager [req-9d361dd9-e2ed-4e54-befd-7fb638855cb7 req-f51b3b9e-9b28-482a-8572-06145ca5257e service nova] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Refreshing instance network info cache due to event network-changed-2b078cf4-954e-4719-8e2e-3569c1ecf656. {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 574.329435] env[60175]: DEBUG oslo_concurrency.lockutils [req-9d361dd9-e2ed-4e54-befd-7fb638855cb7 req-f51b3b9e-9b28-482a-8572-06145ca5257e service nova] Acquiring lock "refresh_cache-da3eaeea-ce26-40eb-af8b-8857f927e431" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 574.329435] env[60175]: DEBUG oslo_concurrency.lockutils [req-9d361dd9-e2ed-4e54-befd-7fb638855cb7 req-f51b3b9e-9b28-482a-8572-06145ca5257e service nova] Acquired lock "refresh_cache-da3eaeea-ce26-40eb-af8b-8857f927e431" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 574.329677] env[60175]: DEBUG nova.network.neutron [req-9d361dd9-e2ed-4e54-befd-7fb638855cb7 req-f51b3b9e-9b28-482a-8572-06145ca5257e service nova] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Refreshing network info cache for port 2b078cf4-954e-4719-8e2e-3569c1ecf656 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 574.573324] env[60175]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Successfully updated port: 9f74c6a0-f8e0-4193-a3dd-5609a74460e0 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 574.587204] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "refresh_cache-81af879b-3bc3-4aff-a99d-98d3aba73512" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 574.587204] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquired lock "refresh_cache-81af879b-3bc3-4aff-a99d-98d3aba73512" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 574.587204] env[60175]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 574.700753] env[60175]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 575.332970] env[60175]: DEBUG nova.network.neutron [req-9d361dd9-e2ed-4e54-befd-7fb638855cb7 req-f51b3b9e-9b28-482a-8572-06145ca5257e service nova] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Updated VIF entry in instance network info cache for port 2b078cf4-954e-4719-8e2e-3569c1ecf656. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 575.333357] env[60175]: DEBUG nova.network.neutron [req-9d361dd9-e2ed-4e54-befd-7fb638855cb7 req-f51b3b9e-9b28-482a-8572-06145ca5257e service nova] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Updating instance_info_cache with network_info: [{"id": "2b078cf4-954e-4719-8e2e-3569c1ecf656", "address": "fa:16:3e:16:91:e9", "network": {"id": "40c36353-2cdb-4f60-a41c-ff342ffb24c4", "bridge": "br-int", "label": "tempest-ServersTestJSON-861103854-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c984e18184364cb6a11bd2014bc377b3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f493cd8-1cb4-42a1-8d56-bfa6ac7cf563", "external-id": "nsx-vlan-transportzone-931", "segmentation_id": 931, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2b078cf4-95", "ovs_interfaceid": "2b078cf4-954e-4719-8e2e-3569c1ecf656", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 575.342287] env[60175]: DEBUG oslo_concurrency.lockutils [req-9d361dd9-e2ed-4e54-befd-7fb638855cb7 req-f51b3b9e-9b28-482a-8572-06145ca5257e service nova] Releasing lock "refresh_cache-da3eaeea-ce26-40eb-af8b-8857f927e431" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 575.463899] env[60175]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Updating instance_info_cache with network_info: [{"id": "9f74c6a0-f8e0-4193-a3dd-5609a74460e0", "address": "fa:16:3e:42:41:bf", "network": {"id": "742512bd-47f6-46d4-84a0-b69b2f202cd4", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1970286857-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b6a82227811a40cda939f7164f414da2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7e0240aa-a694-48fc-a0f9-6f2d3e71aa12", "external-id": "nsx-vlan-transportzone-249", "segmentation_id": 249, "bound_drivers": {"0": "nsxv3"}}, 
"devname": "tap9f74c6a0-f8", "ovs_interfaceid": "9f74c6a0-f8e0-4193-a3dd-5609a74460e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 575.478578] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Releasing lock "refresh_cache-81af879b-3bc3-4aff-a99d-98d3aba73512" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 575.478578] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Instance network_info: |[{"id": "9f74c6a0-f8e0-4193-a3dd-5609a74460e0", "address": "fa:16:3e:42:41:bf", "network": {"id": "742512bd-47f6-46d4-84a0-b69b2f202cd4", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1970286857-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b6a82227811a40cda939f7164f414da2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7e0240aa-a694-48fc-a0f9-6f2d3e71aa12", "external-id": "nsx-vlan-transportzone-249", "segmentation_id": 249, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9f74c6a0-f8", "ovs_interfaceid": "9f74c6a0-f8e0-4193-a3dd-5609a74460e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 575.478801] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:42:41:bf', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7e0240aa-a694-48fc-a0f9-6f2d3e71aa12', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9f74c6a0-f8e0-4193-a3dd-5609a74460e0', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 575.486957] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Creating folder: Project (b6a82227811a40cda939f7164f414da2). Parent ref: group-v845475. 
{{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 575.488933] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2ae48e9e-0acf-4bb0-85e2-9a43d82323fd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.502831] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Created folder: Project (b6a82227811a40cda939f7164f414da2) in parent group-v845475. [ 575.502831] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Creating folder: Instances. Parent ref: group-v845503. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 575.503820] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b2610064-066b-443a-9a0e-87074dff59f7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.513892] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Created folder: Instances in parent group-v845503. [ 575.513892] env[60175]: DEBUG oslo.service.loopingcall [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 575.514065] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 575.514999] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fe6c4b5f-6e4c-4ecd-b3e8-23f143a8d155 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.541025] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 575.541025] env[60175]: value = "task-4292864" [ 575.541025] env[60175]: _type = "Task" [ 575.541025] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 575.547668] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292864, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 576.050057] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292864, 'name': CreateVM_Task, 'duration_secs': 0.283591} completed successfully. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 576.050431] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 576.051842] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 576.052146] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 576.052539] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 576.053744] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-664b7b4c-b310-406e-bb8a-65be3b973654 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.060137] env[60175]: DEBUG oslo_vmware.api [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Waiting for the task: (returnval){ [ 576.060137] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52aa8208-e085-2a7a-10d6-8e16724438ab" [ 576.060137] env[60175]: _type = "Task" [ 576.060137] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 576.067385] env[60175]: DEBUG oslo_vmware.api [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52aa8208-e085-2a7a-10d6-8e16724438ab, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 576.570099] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 576.570377] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 576.570813] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 577.862902] env[60175]: DEBUG nova.compute.manager [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Received event network-vif-plugged-9f74c6a0-f8e0-4193-a3dd-5609a74460e0 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 577.863154] env[60175]: DEBUG oslo_concurrency.lockutils [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] Acquiring lock "81af879b-3bc3-4aff-a99d-98d3aba73512-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 577.863370] env[60175]: DEBUG oslo_concurrency.lockutils [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] Lock "81af879b-3bc3-4aff-a99d-98d3aba73512-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 577.863481] env[60175]: DEBUG oslo_concurrency.lockutils [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] Lock "81af879b-3bc3-4aff-a99d-98d3aba73512-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 577.863671] env[60175]: DEBUG nova.compute.manager [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] No waiting events found dispatching network-vif-plugged-9f74c6a0-f8e0-4193-a3dd-5609a74460e0 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 577.863791] env[60175]: WARNING nova.compute.manager [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Received unexpected event network-vif-plugged-9f74c6a0-f8e0-4193-a3dd-5609a74460e0 for instance 
with vm_state building and task_state spawning. [ 577.863943] env[60175]: DEBUG nova.compute.manager [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Received event network-changed-9f74c6a0-f8e0-4193-a3dd-5609a74460e0 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 577.864340] env[60175]: DEBUG nova.compute.manager [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Refreshing instance network info cache due to event network-changed-9f74c6a0-f8e0-4193-a3dd-5609a74460e0. {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 577.864692] env[60175]: DEBUG oslo_concurrency.lockutils [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] Acquiring lock "refresh_cache-81af879b-3bc3-4aff-a99d-98d3aba73512" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 577.864870] env[60175]: DEBUG oslo_concurrency.lockutils [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] Acquired lock "refresh_cache-81af879b-3bc3-4aff-a99d-98d3aba73512" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 577.865044] env[60175]: DEBUG nova.network.neutron [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Refreshing network info cache for port 9f74c6a0-f8e0-4193-a3dd-5609a74460e0 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 578.622148] env[60175]: DEBUG nova.network.neutron [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Updated VIF entry in instance network info cache for port 9f74c6a0-f8e0-4193-a3dd-5609a74460e0. 
{{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 578.622148] env[60175]: DEBUG nova.network.neutron [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Updating instance_info_cache with network_info: [{"id": "9f74c6a0-f8e0-4193-a3dd-5609a74460e0", "address": "fa:16:3e:42:41:bf", "network": {"id": "742512bd-47f6-46d4-84a0-b69b2f202cd4", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1970286857-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b6a82227811a40cda939f7164f414da2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7e0240aa-a694-48fc-a0f9-6f2d3e71aa12", "external-id": "nsx-vlan-transportzone-249", "segmentation_id": 249, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9f74c6a0-f8", "ovs_interfaceid": "9f74c6a0-f8e0-4193-a3dd-5609a74460e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 578.632063] env[60175]: DEBUG oslo_concurrency.lockutils [req-1e463225-d0cd-44a3-b4e6-3a9553a5dcbb req-0eeda6e3-b461-4f01-863b-9b7345107d01 service nova] Releasing lock "refresh_cache-81af879b-3bc3-4aff-a99d-98d3aba73512" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 602.125712] env[60175]: WARNING oslo_vmware.rw_handles [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 602.125712] env[60175]: ERROR oslo_vmware.rw_handles [ 602.125712] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-2d6621be-57d9-4a24-912b-a1842c374951 
tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/24f22618-a1c4-4ac8-b5f7-2d35133b9d1b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 602.127116] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 602.128960] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Copying Virtual Disk [datastore2] vmware_temp/24f22618-a1c4-4ac8-b5f7-2d35133b9d1b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/24f22618-a1c4-4ac8-b5f7-2d35133b9d1b/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 602.128960] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-739b1157-7bfc-40ad-a062-e735cf424168 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.140254] env[60175]: DEBUG oslo_vmware.api [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Waiting for the task: (returnval){ [ 602.140254] env[60175]: value = "task-4292865" [ 602.140254] env[60175]: _type = "Task" [ 602.140254] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 602.150978] env[60175]: DEBUG oslo_vmware.api [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Task: {'id': task-4292865, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 602.653424] env[60175]: DEBUG oslo_vmware.exceptions [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Fault InvalidArgument not matched. 
{{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 602.654231] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 602.658805] env[60175]: ERROR nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 602.658805] env[60175]: Faults: ['InvalidArgument'] [ 602.658805] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Traceback (most recent call last): [ 602.658805] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 602.658805] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] yield resources [ 602.658805] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 602.658805] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] self.driver.spawn(context, instance, image_meta, [ 602.658805] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 602.658805] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] self._vmops.spawn(context, instance, image_meta, injected_files, [ 602.658805] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 602.658805] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] self._fetch_image_if_missing(context, vi) [ 602.658805] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] image_cache(vi, tmp_image_ds_loc) [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] vm_util.copy_virtual_disk( [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] session._wait_for_task(vmdk_copy_task) [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] return self.wait_for_task(task_ref) [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] return evt.wait() [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] result = hub.switch() [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 602.659227] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] return self.greenlet.switch() [ 602.659626] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 602.659626] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] self.f(*self.args, **self.kw) [ 602.659626] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 602.659626] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] raise exceptions.translate_fault(task_info.error) [ 602.659626] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 602.659626] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Faults: ['InvalidArgument'] [ 602.659626] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] [ 602.659626] env[60175]: INFO nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Terminating instance [ 602.662607] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "refresh_cache-dedec08e-95d1-4467-96a4-cdec5f170e01" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 602.662607] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquired lock "refresh_cache-dedec08e-95d1-4467-96a4-cdec5f170e01" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 602.662607] env[60175]: DEBUG nova.network.neutron [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Building network info cache for instance {{(pid=60175) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 602.664465] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 602.664465] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 602.664859] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bf0b478a-fcbb-47ec-8a85-98b9e7de0931 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.681255] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 602.681255] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 602.684302] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-96a2f07d-32d2-4ced-bb28-24152b092f72 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.691025] env[60175]: DEBUG oslo_vmware.api [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Waiting for the task: (returnval){ [ 602.691025] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5246b48d-a86b-7fb9-ea69-c1995853fdb3" [ 602.691025] env[60175]: _type = "Task" [ 602.691025] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 602.702032] env[60175]: DEBUG oslo_vmware.api [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5246b48d-a86b-7fb9-ea69-c1995853fdb3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 602.739764] env[60175]: DEBUG nova.network.neutron [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 602.982198] env[60175]: DEBUG nova.network.neutron [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 602.994026] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Releasing lock "refresh_cache-dedec08e-95d1-4467-96a4-cdec5f170e01" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 602.994301] env[60175]: DEBUG nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 602.994493] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 602.995636] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0400faa0-7ca4-4acf-a34b-af757f66fe30 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.007995] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 603.008254] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-157d3c7c-c812-4d7d-85fe-81d823b64c01 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.063013] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 603.063013] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 603.063013] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Deleting the datastore file [datastore2] dedec08e-95d1-4467-96a4-cdec5f170e01 {{(pid=60175) file_delete 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 603.063013] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-be4eb0ae-0464-427d-9cf5-f1e0cce98a0b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.074040] env[60175]: DEBUG oslo_vmware.api [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Waiting for the task: (returnval){ [ 603.074040] env[60175]: value = "task-4292870" [ 603.074040] env[60175]: _type = "Task" [ 603.074040] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 603.091564] env[60175]: DEBUG oslo_vmware.api [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Task: {'id': task-4292870, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 603.202133] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 603.202825] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Creating directory with path [datastore2] vmware_temp/04d09432-24cd-447e-8669-7dc99ced1595/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 603.203912] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c9c42289-ae7e-4439-8482-294933242e7c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.218154] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Created directory with path [datastore2] vmware_temp/04d09432-24cd-447e-8669-7dc99ced1595/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 603.218473] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Fetch image to [datastore2] vmware_temp/04d09432-24cd-447e-8669-7dc99ced1595/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 603.218661] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/04d09432-24cd-447e-8669-7dc99ced1595/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) 
_fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 603.219545] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81410cea-03bc-43c8-a9ef-9307e1442228 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.228304] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7929c1b8-0b75-400f-b42d-520f588a54f3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.238169] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d49bd5bd-59e8-4830-87e0-a08ca20cd3cc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.280368] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55444601-a994-4e9f-95cb-d8bbdfe5f08f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.287945] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b7e99d65-b676-49ce-8dc0-82f5e407f3f3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.372355] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 603.467708] env[60175]: DEBUG oslo_vmware.rw_handles [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/04d09432-24cd-447e-8669-7dc99ced1595/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 603.545227] env[60175]: DEBUG oslo_vmware.rw_handles [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 603.545380] env[60175]: DEBUG oslo_vmware.rw_handles [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/04d09432-24cd-447e-8669-7dc99ced1595/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 603.588496] env[60175]: DEBUG oslo_vmware.api [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Task: {'id': task-4292870, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.055757} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 603.588887] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 603.589197] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 603.589564] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 603.590138] env[60175]: INFO nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Took 0.60 seconds to destroy the instance on the hypervisor. [ 603.590518] env[60175]: DEBUG oslo.service.loopingcall [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 603.590827] env[60175]: DEBUG nova.compute.manager [-] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 603.594150] env[60175]: DEBUG nova.compute.claims [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 603.594426] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.594772] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.833350] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dee86707-2fd7-4aae-97bd-6fb68e73ea4d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.844672] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66f700d9-c652-4d8a-bb8f-4e0dde270372 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.879180] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d8fb77f-4191-4879-9453-945557cd7fef {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.887535] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bcefd68-89b1-4b74-a2c6-656e1e82b4b0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.902490] env[60175]: DEBUG nova.compute.provider_tree [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 603.915194] env[60175]: DEBUG nova.scheduler.client.report [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 603.935836] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 
tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.341s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.936915] env[60175]: ERROR nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 603.936915] env[60175]: Faults: ['InvalidArgument'] [ 603.936915] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Traceback (most recent call last): [ 603.936915] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 603.936915] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] self.driver.spawn(context, instance, image_meta, [ 603.936915] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 603.936915] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] self._vmops.spawn(context, instance, image_meta, injected_files, [ 603.936915] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 603.936915] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] self._fetch_image_if_missing(context, vi) [ 603.936915] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 603.936915] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] image_cache(vi, tmp_image_ds_loc) [ 603.936915] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] vm_util.copy_virtual_disk( [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] session._wait_for_task(vmdk_copy_task) [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] return self.wait_for_task(task_ref) [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] return evt.wait() [ 603.937304] env[60175]: ERROR 
nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] result = hub.switch() [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] return self.greenlet.switch() [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 603.937304] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] self.f(*self.args, **self.kw) [ 603.937681] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 603.937681] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] raise exceptions.translate_fault(task_info.error) [ 603.937681] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 603.937681] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Faults: ['InvalidArgument'] [ 603.937681] env[60175]: ERROR nova.compute.manager [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] [ 603.937681] env[60175]: DEBUG nova.compute.utils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 603.944159] env[60175]: DEBUG nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Build of instance dedec08e-95d1-4467-96a4-cdec5f170e01 was re-scheduled: A specified parameter was not correct: fileType [ 603.944159] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 603.944159] env[60175]: DEBUG nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 603.944159] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquiring lock "refresh_cache-dedec08e-95d1-4467-96a4-cdec5f170e01" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 603.944159] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Acquired lock 
"refresh_cache-dedec08e-95d1-4467-96a4-cdec5f170e01" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 603.945208] env[60175]: DEBUG nova.network.neutron [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 604.074789] env[60175]: DEBUG nova.network.neutron [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 604.623765] env[60175]: DEBUG nova.network.neutron [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 604.634252] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Releasing lock "refresh_cache-dedec08e-95d1-4467-96a4-cdec5f170e01" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 604.634482] env[60175]: DEBUG nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 604.634671] env[60175]: DEBUG nova.compute.manager [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] [instance: dedec08e-95d1-4467-96a4-cdec5f170e01] Skipping network deallocation for instance since networking was not requested. {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 604.750440] env[60175]: INFO nova.scheduler.client.report [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Deleted allocations for instance dedec08e-95d1-4467-96a4-cdec5f170e01 [ 604.772219] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2d6621be-57d9-4a24-912b-a1842c374951 tempest-ServerDiagnosticsV248Test-720135860 tempest-ServerDiagnosticsV248Test-720135860-project-member] Lock "dedec08e-95d1-4467-96a4-cdec5f170e01" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 52.031s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.811075] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 604.889620] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.889801] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.892426] env[60175]: INFO nova.compute.claims [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 605.096533] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e68357b-16a6-4a5f-8c4c-ae1fecc4eaf4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.103487] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b216f56-e7c2-4147-a909-7bec97d6d85d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.142495] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f07034cc-e7a0-4700-aeb2-ad5d361ffe2f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.156583] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d57dc5a6-0d2f-440f-8a04-fac6123e9fce {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.172284] env[60175]: DEBUG nova.compute.provider_tree [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 605.190370] env[60175]: DEBUG nova.scheduler.client.report [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 605.215450] env[60175]: DEBUG oslo_concurrency.lockutils 
[None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.216249] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 605.258111] env[60175]: DEBUG nova.compute.utils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 605.262109] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 605.262109] env[60175]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 605.275671] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 605.358623] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Start spawning the instance on the hypervisor. 
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 605.419610] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 605.419917] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 605.420382] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 605.420935] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 605.420935] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 605.421179] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 605.421254] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 605.421958] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 605.421958] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 605.421958] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 605.421958] env[60175]: DEBUG nova.virt.hardware [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 605.423010] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fc5d1cb-82bb-4a48-bee8-0babfa0c8150 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.440187] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11b006da-0e41-4dca-91f4-96216b85272a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.511651] env[60175]: DEBUG nova.policy [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8ec9debbb6274965b14fd444ab31e352', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '911ada63d0ee4b5f965cb5d251ab5a78', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 607.401095] env[60175]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Successfully created port: c264df7c-be7d-4122-afa3-66b68fcebc70 {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 609.834291] env[60175]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Successfully updated port: c264df7c-be7d-4122-afa3-66b68fcebc70 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 609.849922] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "refresh_cache-843d4db6-c1fb-4b74-ad3c-779e309a170e" {{(pid=60175) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 609.850087] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquired lock "refresh_cache-843d4db6-c1fb-4b74-ad3c-779e309a170e" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 609.850240] env[60175]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 609.996682] env[60175]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 611.172870] env[60175]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Updating instance_info_cache with network_info: [{"id": "c264df7c-be7d-4122-afa3-66b68fcebc70", "address": "fa:16:3e:c0:2e:53", "network": {"id": "a5394e14-04c6-4270-9684-414b98ee0e0a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-587671394-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "911ada63d0ee4b5f965cb5d251ab5a78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed8a78a1-87dc-488e-a092-afd1c2a2ddde", "external-id": "nsx-vlan-transportzone-21", "segmentation_id": 21, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc264df7c-be", "ovs_interfaceid": "c264df7c-be7d-4122-afa3-66b68fcebc70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 611.186061] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Releasing lock "refresh_cache-843d4db6-c1fb-4b74-ad3c-779e309a170e" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 611.186061] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Instance network_info: |[{"id": "c264df7c-be7d-4122-afa3-66b68fcebc70", 
"address": "fa:16:3e:c0:2e:53", "network": {"id": "a5394e14-04c6-4270-9684-414b98ee0e0a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-587671394-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "911ada63d0ee4b5f965cb5d251ab5a78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed8a78a1-87dc-488e-a092-afd1c2a2ddde", "external-id": "nsx-vlan-transportzone-21", "segmentation_id": 21, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc264df7c-be", "ovs_interfaceid": "c264df7c-be7d-4122-afa3-66b68fcebc70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 611.186263] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c0:2e:53', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ed8a78a1-87dc-488e-a092-afd1c2a2ddde', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c264df7c-be7d-4122-afa3-66b68fcebc70', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 611.197369] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Creating folder: Project (911ada63d0ee4b5f965cb5d251ab5a78). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 611.197651] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2845c880-e552-4b5b-93a6-70e793590039 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.212900] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Created folder: Project (911ada63d0ee4b5f965cb5d251ab5a78) in parent group-v845475. [ 611.213196] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Creating folder: Instances. Parent ref: group-v845509. 
{{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 611.213489] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-287a6f53-3a8d-497c-9d8e-a6812e7d5b8a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.223539] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Created folder: Instances in parent group-v845509. [ 611.223808] env[60175]: DEBUG oslo.service.loopingcall [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 611.224125] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 611.224215] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-45552909-e0e9-4e94-ac2e-c7438955a829 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.246014] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 611.246014] env[60175]: value = "task-4292877" [ 611.246014] env[60175]: _type = "Task" [ 611.246014] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 611.254349] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292877, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 611.758034] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292877, 'name': CreateVM_Task, 'duration_secs': 0.308278} completed successfully. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 611.758150] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 611.759298] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 611.759564] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 611.759882] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 611.760145] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-421acc8a-3f0e-4567-84e9-847cb8ce1ad0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.765248] env[60175]: DEBUG oslo_vmware.api [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Waiting for the task: (returnval){ [ 611.765248] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52980c70-2704-9397-06d3-0128a74af30a" [ 611.765248] env[60175]: _type = "Task" [ 611.765248] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 611.773485] env[60175]: DEBUG oslo_vmware.api [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52980c70-2704-9397-06d3-0128a74af30a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 612.044399] env[60175]: DEBUG nova.compute.manager [req-95968086-2758-4ad9-92d9-23a7c92a7d90 req-2146c6ff-b51b-4069-8690-f0987c7e2e0e service nova] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Received event network-vif-plugged-c264df7c-be7d-4122-afa3-66b68fcebc70 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 612.044611] env[60175]: DEBUG oslo_concurrency.lockutils [req-95968086-2758-4ad9-92d9-23a7c92a7d90 req-2146c6ff-b51b-4069-8690-f0987c7e2e0e service nova] Acquiring lock "843d4db6-c1fb-4b74-ad3c-779e309a170e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 612.045606] env[60175]: DEBUG oslo_concurrency.lockutils [req-95968086-2758-4ad9-92d9-23a7c92a7d90 req-2146c6ff-b51b-4069-8690-f0987c7e2e0e service nova] Lock "843d4db6-c1fb-4b74-ad3c-779e309a170e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 612.045606] env[60175]: DEBUG oslo_concurrency.lockutils [req-95968086-2758-4ad9-92d9-23a7c92a7d90 req-2146c6ff-b51b-4069-8690-f0987c7e2e0e service nova] Lock "843d4db6-c1fb-4b74-ad3c-779e309a170e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 612.045606] env[60175]: DEBUG nova.compute.manager [req-95968086-2758-4ad9-92d9-23a7c92a7d90 req-2146c6ff-b51b-4069-8690-f0987c7e2e0e service nova] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] No waiting events found dispatching network-vif-plugged-c264df7c-be7d-4122-afa3-66b68fcebc70 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 612.046806] env[60175]: WARNING nova.compute.manager [req-95968086-2758-4ad9-92d9-23a7c92a7d90 req-2146c6ff-b51b-4069-8690-f0987c7e2e0e service nova] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Received unexpected event network-vif-plugged-c264df7c-be7d-4122-afa3-66b68fcebc70 for instance with vm_state building and task_state spawning. 
[ 612.281796] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 612.281796] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 612.281796] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 618.983229] env[60175]: DEBUG nova.compute.manager [req-bb5d9c1a-4476-4b51-b351-0c48e0f58b89 req-e6c3f5aa-8aba-4ff5-a92d-955ef7dad190 service nova] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Received event network-changed-c264df7c-be7d-4122-afa3-66b68fcebc70 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 618.983515] env[60175]: DEBUG nova.compute.manager [req-bb5d9c1a-4476-4b51-b351-0c48e0f58b89 req-e6c3f5aa-8aba-4ff5-a92d-955ef7dad190 service nova] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Refreshing instance network info cache due to event network-changed-c264df7c-be7d-4122-afa3-66b68fcebc70. {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 618.983898] env[60175]: DEBUG oslo_concurrency.lockutils [req-bb5d9c1a-4476-4b51-b351-0c48e0f58b89 req-e6c3f5aa-8aba-4ff5-a92d-955ef7dad190 service nova] Acquiring lock "refresh_cache-843d4db6-c1fb-4b74-ad3c-779e309a170e" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 618.983898] env[60175]: DEBUG oslo_concurrency.lockutils [req-bb5d9c1a-4476-4b51-b351-0c48e0f58b89 req-e6c3f5aa-8aba-4ff5-a92d-955ef7dad190 service nova] Acquired lock "refresh_cache-843d4db6-c1fb-4b74-ad3c-779e309a170e" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 618.984026] env[60175]: DEBUG nova.network.neutron [req-bb5d9c1a-4476-4b51-b351-0c48e0f58b89 req-e6c3f5aa-8aba-4ff5-a92d-955ef7dad190 service nova] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Refreshing network info cache for port c264df7c-be7d-4122-afa3-66b68fcebc70 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 620.249868] env[60175]: DEBUG nova.network.neutron [req-bb5d9c1a-4476-4b51-b351-0c48e0f58b89 req-e6c3f5aa-8aba-4ff5-a92d-955ef7dad190 service nova] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Updated VIF entry in instance network info cache for port c264df7c-be7d-4122-afa3-66b68fcebc70. 
{{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 620.250306] env[60175]: DEBUG nova.network.neutron [req-bb5d9c1a-4476-4b51-b351-0c48e0f58b89 req-e6c3f5aa-8aba-4ff5-a92d-955ef7dad190 service nova] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Updating instance_info_cache with network_info: [{"id": "c264df7c-be7d-4122-afa3-66b68fcebc70", "address": "fa:16:3e:c0:2e:53", "network": {"id": "a5394e14-04c6-4270-9684-414b98ee0e0a", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-587671394-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "911ada63d0ee4b5f965cb5d251ab5a78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed8a78a1-87dc-488e-a092-afd1c2a2ddde", "external-id": "nsx-vlan-transportzone-21", "segmentation_id": 21, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc264df7c-be", "ovs_interfaceid": "c264df7c-be7d-4122-afa3-66b68fcebc70", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.261244] env[60175]: DEBUG oslo_concurrency.lockutils [req-bb5d9c1a-4476-4b51-b351-0c48e0f58b89 req-e6c3f5aa-8aba-4ff5-a92d-955ef7dad190 service nova] Releasing lock "refresh_cache-843d4db6-c1fb-4b74-ad3c-779e309a170e" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 623.444524] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 623.487235] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 623.487235] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 623.487439] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 623.502502] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.502780] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock 
"compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.503832] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.503832] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 623.504684] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a53134fe-5b4a-4e4e-8730-e2cec9bf75c2 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.513961] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6565abd-cae6-4ee1-a9f7-09a5c931c3eb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.529405] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdfb194a-e21a-4169-b016-fdefb0bad5ea {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.536392] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-843f47fd-a5fc-4d5d-8ac6-2e2fd60ca89f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.572044] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180665MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 623.572252] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.572453] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.650356] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance a0755f79-7df4-4660-92e6-5dd80af94aaa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 623.650514] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance b5636e10-af08-49d3-a9b2-8122521a9e2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 623.650646] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 99d97004-9f23-48ee-a88b-75fdb6acc4b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 623.650772] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 623.650893] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 72caf1e5-e894-4581-a95d-21dda85e11b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 623.651278] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 623.651278] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance da3eaeea-ce26-40eb-af8b-8857f927e431 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 623.651278] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 8bc7299c-35d4-4e9f-a243-2834fbadd987 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 623.651452] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 81af879b-3bc3-4aff-a99d-98d3aba73512 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 623.651499] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 843d4db6-c1fb-4b74-ad3c-779e309a170e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 623.651698] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 623.652484] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=149GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 623.828281] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-593a5e64-80dc-489b-9de9-f4986cba9743 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.837675] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcae2cc3-ffe7-47b7-9218-b1e78ae31e57 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.871927] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea67c865-34bf-4801-aa56-0eb4c2205212 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.880698] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a1bde7c-0ca7-427e-87d7-f0e15d0e1bb6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.893202] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 623.906077] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 623.928036] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 623.928258] env[60175]: DEBUG oslo_concurrency.lockutils [None 
req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.356s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.390973] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 625.391294] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 625.392509] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 625.392509] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 625.419670] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 625.419835] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 625.420000] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 625.420142] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 625.420262] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 625.420378] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 625.420492] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Skipping network cache update for instance because it is Building. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 625.420609] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 625.420719] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 625.420831] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 625.420946] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 625.421469] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 625.421634] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 625.421782] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 625.421931] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 625.422063] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 636.039698] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "029d2099-2e55-4632-81b6-b59d6a20faab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.040438] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "029d2099-2e55-4632-81b6-b59d6a20faab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.988652] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "068814dd-328c-48d1-b514-34eb43b0f2b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.988945] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "068814dd-328c-48d1-b514-34eb43b0f2b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.314588] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "500d78f9-ee0c-4620-9936-1a9b4f4fc09a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 641.314854] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "500d78f9-ee0c-4620-9936-1a9b4f4fc09a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.309192] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "39a2035c-bb7b-4837-b556-e8bb38ffb514" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.309442] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "39a2035c-bb7b-4837-b556-e8bb38ffb514" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.558170] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "57a5dcae-6861-418a-a041-9cd5b7a43982" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.558736] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "57a5dcae-6861-418a-a041-9cd5b7a43982" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 648.544710] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Acquiring lock "6c94c59c-44ab-4cb9-8480-18e8a424993b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 648.545035] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "6c94c59c-44ab-4cb9-8480-18e8a424993b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 653.137052] env[60175]: WARNING oslo_vmware.rw_handles [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 653.137052] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 653.137052] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 653.137052] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 653.137052] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 653.137052] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 653.137052] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 653.137052] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 653.137052] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 653.137052] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 653.137052] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 653.137052] env[60175]: ERROR 
oslo_vmware.rw_handles [ 653.137631] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/04d09432-24cd-447e-8669-7dc99ced1595/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 653.139044] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 653.139317] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Copying Virtual Disk [datastore2] vmware_temp/04d09432-24cd-447e-8669-7dc99ced1595/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/04d09432-24cd-447e-8669-7dc99ced1595/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 653.139691] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1b730ca8-50c0-483f-9865-9387ec2c0c42 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.148680] env[60175]: DEBUG oslo_vmware.api [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Waiting for the task: (returnval){ [ 653.148680] env[60175]: value = "task-4292893" [ 653.148680] env[60175]: _type = "Task" [ 653.148680] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 653.157192] env[60175]: DEBUG oslo_vmware.api [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Task: {'id': task-4292893, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 653.661310] env[60175]: DEBUG oslo_vmware.exceptions [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Fault InvalidArgument not matched. 
{{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 653.661748] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 653.662702] env[60175]: ERROR nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 653.662702] env[60175]: Faults: ['InvalidArgument'] [ 653.662702] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Traceback (most recent call last): [ 653.662702] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 653.662702] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] yield resources [ 653.662702] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 653.662702] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] self.driver.spawn(context, instance, image_meta, [ 653.662702] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 653.662702] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 653.662702] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 653.662702] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] self._fetch_image_if_missing(context, vi) [ 653.662702] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] image_cache(vi, tmp_image_ds_loc) [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] vm_util.copy_virtual_disk( [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] session._wait_for_task(vmdk_copy_task) [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] return self.wait_for_task(task_ref) [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] return evt.wait() [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] result = hub.switch() [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 653.663087] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] return self.greenlet.switch() [ 653.663489] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 653.663489] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] self.f(*self.args, **self.kw) [ 653.663489] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 653.663489] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] raise exceptions.translate_fault(task_info.error) [ 653.663489] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 653.663489] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Faults: ['InvalidArgument'] [ 653.663489] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] [ 653.665287] env[60175]: INFO nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Terminating instance [ 653.667094] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 653.667094] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 653.667545] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Start destroying the 
instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 653.667844] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 653.668172] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cab012c8-f2a7-48d7-95d9-4a846d8f8723 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.672223] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9802fbde-177a-48f6-815a-abcc4512eb5e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.679098] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 653.679470] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-913588c6-f297-4c56-b968-a0fe6e2088d3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.681907] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 653.685022] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 653.685022] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4a8a118f-de49-4f92-b065-5ef0e9ff0c9a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.688686] env[60175]: DEBUG oslo_vmware.api [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Waiting for the task: (returnval){ [ 653.688686] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5274159a-305d-8b36-2808-c00b78e7d04c" [ 653.688686] env[60175]: _type = "Task" [ 653.688686] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 653.696757] env[60175]: DEBUG oslo_vmware.api [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5274159a-305d-8b36-2808-c00b78e7d04c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 653.758024] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 653.758024] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 653.758024] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Deleting the datastore file [datastore2] a0755f79-7df4-4660-92e6-5dd80af94aaa {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 653.758024] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2c8f9cfb-19d3-4e10-949b-0b728fe386e6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.766023] env[60175]: DEBUG oslo_vmware.api [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Waiting for the task: (returnval){ [ 653.766023] env[60175]: value = "task-4292895" [ 653.766023] env[60175]: _type = "Task" [ 653.766023] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 653.774024] env[60175]: DEBUG oslo_vmware.api [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Task: {'id': task-4292895, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 654.199443] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 654.199748] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Creating directory with path [datastore2] vmware_temp/f1fa1a70-4838-4e87-b20a-1a9833468cb7/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 654.199973] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f0f3f16b-6801-45d6-ab6a-81d3a0818ad8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.215016] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Created directory with path [datastore2] vmware_temp/f1fa1a70-4838-4e87-b20a-1a9833468cb7/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 654.215252] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Fetch image to [datastore2] vmware_temp/f1fa1a70-4838-4e87-b20a-1a9833468cb7/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 654.215424] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/f1fa1a70-4838-4e87-b20a-1a9833468cb7/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 654.216201] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c260108-3566-4df3-ba9f-d880cb4995bc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.224277] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad47e6ad-d823-4851-803e-2a1d07936251 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.235294] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-943a7a4d-bcec-4e0f-a6d6-ca8fa4db873b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.281270] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90b2b197-b682-4a6b-ba75-5f825c16ac83 {{(pid=60175) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.291328] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f4ff2a17-0be5-41c3-9d83-ecfb5230b29f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.293313] env[60175]: DEBUG oslo_vmware.api [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Task: {'id': task-4292895, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.061203} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 654.293623] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 654.293847] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 654.294107] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 654.294312] env[60175]: INFO nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Took 0.63 seconds to destroy the instance on the hypervisor. 
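Editor's note on the task-handling pattern these entries record: a vCenter method that returns a Task object (here FileManager.DeleteDatastoreFile_Task) is invoked, the caller polls it ("progress is 0%") via oslo.vmware's wait_for_task, and a failed task surfaces as a VimFaultException (as in the "A specified parameter was not correct: fileType" fault further down in this log). The sketch below illustrates that pattern in isolation; it is not Nova's code path, the vCenter host, credentials, datacenter reference and datastore path are hypothetical placeholders, and error handling is reduced to the fault case seen here.

    # Minimal sketch (not Nova's implementation): submit a vCenter task via
    # oslo.vmware and block until it completes, mirroring the
    # DeleteDatastoreFile_Task / wait_for_task entries in this log.
    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vmware_exc

    # Placeholder connection details -- substitute real values.
    session = vmware_api.VMwareAPISession(
        'vc.example.test',                 # hypothetical vCenter host
        'administrator@vsphere.local',     # hypothetical user
        'secret',                          # hypothetical password
        api_retry_count=10,
        task_poll_interval=0.5)

    def delete_datastore_file(ds_path, dc_ref):
        """Delete a datastore file and wait for the server-side task."""
        file_manager = session.vim.service_content.fileManager
        task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                  file_manager, name=ds_path,
                                  datacenter=dc_ref)
        try:
            # wait_for_task() polls the task state and raises
            # VimFaultException if the task ends in an error state.
            return session.wait_for_task(task)
        except vmware_exc.VimFaultException as exc:
            # e.g. "A specified parameter was not correct: fileType",
            # with exc.fault_list == ['InvalidArgument']
            print('task failed: %s (faults: %s)' % (exc, exc.fault_list))
            raise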
[ 654.296566] env[60175]: DEBUG nova.compute.claims [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 654.296864] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 654.297166] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 654.315676] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 654.378017] env[60175]: DEBUG oslo_vmware.rw_handles [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f1fa1a70-4838-4e87-b20a-1a9833468cb7/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 654.441262] env[60175]: DEBUG oslo_vmware.rw_handles [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 654.441645] env[60175]: DEBUG oslo_vmware.rw_handles [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f1fa1a70-4838-4e87-b20a-1a9833468cb7/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 654.642417] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c07ed6de-02b7-4b41-b886-a48c5595c06f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.650810] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a884f96-4120-4aea-9643-fce2f95ec441 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.682894] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9838235-a185-470e-9a13-a5d6bc6509bd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.690946] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdd5a8dd-f940-45a6-87e0-6ca9a05dd008 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.705318] env[60175]: DEBUG nova.compute.provider_tree [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 654.719024] env[60175]: DEBUG nova.scheduler.client.report [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 654.739477] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.441s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 654.739477] env[60175]: ERROR nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 654.739477] env[60175]: Faults: ['InvalidArgument'] [ 654.739477] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Traceback (most recent call last): [ 654.739477] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 654.739477] env[60175]: ERROR nova.compute.manager [instance: 
a0755f79-7df4-4660-92e6-5dd80af94aaa] self.driver.spawn(context, instance, image_meta, [ 654.739477] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 654.739477] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 654.739477] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 654.739477] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] self._fetch_image_if_missing(context, vi) [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] image_cache(vi, tmp_image_ds_loc) [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] vm_util.copy_virtual_disk( [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] session._wait_for_task(vmdk_copy_task) [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] return self.wait_for_task(task_ref) [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] return evt.wait() [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] result = hub.switch() [ 654.740358] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 654.740699] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] return self.greenlet.switch() [ 654.740699] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 654.740699] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] self.f(*self.args, **self.kw) [ 654.740699] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 654.740699] env[60175]: ERROR 
nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] raise exceptions.translate_fault(task_info.error) [ 654.740699] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 654.740699] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Faults: ['InvalidArgument'] [ 654.740699] env[60175]: ERROR nova.compute.manager [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] [ 654.740699] env[60175]: DEBUG nova.compute.utils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 654.742424] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Build of instance a0755f79-7df4-4660-92e6-5dd80af94aaa was re-scheduled: A specified parameter was not correct: fileType [ 654.742424] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 654.742424] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 654.742790] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 654.742957] env[60175]: DEBUG nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 654.743132] env[60175]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 655.582806] env[60175]: DEBUG nova.network.neutron [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 655.601442] env[60175]: INFO nova.compute.manager [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] [instance: a0755f79-7df4-4660-92e6-5dd80af94aaa] Took 0.86 seconds to deallocate network for instance. [ 655.724336] env[60175]: INFO nova.scheduler.client.report [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Deleted allocations for instance a0755f79-7df4-4660-92e6-5dd80af94aaa [ 655.756266] env[60175]: DEBUG oslo_concurrency.lockutils [None req-76e2c931-b8a5-473d-9e30-4e11e608f98a tempest-ServerExternalEventsTest-1135534376 tempest-ServerExternalEventsTest-1135534376-project-member] Lock "a0755f79-7df4-4660-92e6-5dd80af94aaa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 102.576s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.772507] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 655.874528] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.874776] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.876505] env[60175]: INFO nova.compute.claims [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 656.058087] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.058351] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.179378] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4a1ad18-becf-44c3-a608-250d86616ed2 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.189032] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d829b51-3345-418a-9796-17f2fd601c97 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.223780] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a468ee44-d64d-4a72-a3b3-5c13232df0f2 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.232083] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08116b4b-6c39-4f43-bd50-f3cf3103eb84 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.247460] env[60175]: DEBUG nova.compute.provider_tree [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 656.260574] env[60175]: DEBUG nova.scheduler.client.report [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 656.275465] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.401s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.276028] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 656.313326] env[60175]: DEBUG nova.compute.utils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 656.314605] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 656.314776] env[60175]: DEBUG nova.network.neutron [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 656.327776] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 656.402485] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Start spawning the instance on the hypervisor. 
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 656.424036] env[60175]: DEBUG nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 656.424151] env[60175]: DEBUG nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 656.424831] env[60175]: DEBUG nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 656.424831] env[60175]: DEBUG nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 656.424831] env[60175]: DEBUG nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 656.424831] env[60175]: DEBUG nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 656.424989] env[60175]: DEBUG nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 656.425124] env[60175]: DEBUG nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 656.425285] env[60175]: DEBUG 
nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 656.425441] env[60175]: DEBUG nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 656.427189] env[60175]: DEBUG nova.virt.hardware [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 656.427189] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6b8a54a-b66f-41c2-82f7-79ce6b07e47b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.434520] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0da691b5-8749-401f-a01c-3213d7f30d2f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.459026] env[60175]: DEBUG nova.policy [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd7795b536d448fb9aed5ab18496fc5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '94fa7bddb7f64f01baa46ea6cba2bdb1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 657.294838] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Acquiring lock "71244679-78d6-4d49-b4b5-ef96fd313ae8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.295567] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "71244679-78d6-4d49-b4b5-ef96fd313ae8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.618210] env[60175]: DEBUG nova.network.neutron [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Successfully created port: 5e559ae9-2ba8-4907-b16c-34d4c09c7d10 {{(pid=60175) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 658.734855] env[60175]: DEBUG nova.network.neutron [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Successfully updated port: 5e559ae9-2ba8-4907-b16c-34d4c09c7d10 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 658.749022] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "refresh_cache-029d2099-2e55-4632-81b6-b59d6a20faab" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 658.749022] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquired lock "refresh_cache-029d2099-2e55-4632-81b6-b59d6a20faab" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 658.749910] env[60175]: DEBUG nova.network.neutron [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 658.824256] env[60175]: DEBUG nova.network.neutron [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 659.080534] env[60175]: DEBUG nova.network.neutron [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Updating instance_info_cache with network_info: [{"id": "5e559ae9-2ba8-4907-b16c-34d4c09c7d10", "address": "fa:16:3e:54:91:c0", "network": {"id": "15a5fb8a-f254-4fec-9290-6fb8aa23774e", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-868109212-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "94fa7bddb7f64f01baa46ea6cba2bdb1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc845e3-654b-43c6-acea-dde1084f0ad0", "external-id": "nsx-vlan-transportzone-344", "segmentation_id": 344, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e559ae9-2b", "ovs_interfaceid": "5e559ae9-2ba8-4907-b16c-34d4c09c7d10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 659.094424] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Releasing lock "refresh_cache-029d2099-2e55-4632-81b6-b59d6a20faab" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 659.094880] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Instance network_info: |[{"id": "5e559ae9-2ba8-4907-b16c-34d4c09c7d10", "address": "fa:16:3e:54:91:c0", "network": {"id": "15a5fb8a-f254-4fec-9290-6fb8aa23774e", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-868109212-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "94fa7bddb7f64f01baa46ea6cba2bdb1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc845e3-654b-43c6-acea-dde1084f0ad0", "external-id": "nsx-vlan-transportzone-344", "segmentation_id": 344, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e559ae9-2b", "ovs_interfaceid": "5e559ae9-2ba8-4907-b16c-34d4c09c7d10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 659.095190] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:54:91:c0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ccc845e3-654b-43c6-acea-dde1084f0ad0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5e559ae9-2ba8-4907-b16c-34d4c09c7d10', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 659.107260] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Creating folder: Project (94fa7bddb7f64f01baa46ea6cba2bdb1). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 659.108236] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b9f37cf6-c70d-42f2-a895-75e19637f5e0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 659.120454] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Created folder: Project (94fa7bddb7f64f01baa46ea6cba2bdb1) in parent group-v845475. [ 659.120662] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Creating folder: Instances. Parent ref: group-v845517. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 659.120911] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0f2f5944-6e01-437d-96f8-13ae0011cab9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 659.131328] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Created folder: Instances in parent group-v845517. [ 659.131328] env[60175]: DEBUG oslo.service.loopingcall [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 659.131328] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 659.131328] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-78cc5ab0-bf51-4e6c-a736-cf9d1ebca954 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 659.153083] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 659.153083] env[60175]: value = "task-4292898" [ 659.153083] env[60175]: _type = "Task" [ 659.153083] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 659.163752] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292898, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 659.497035] env[60175]: DEBUG nova.compute.manager [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Received event network-vif-plugged-5e559ae9-2ba8-4907-b16c-34d4c09c7d10 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 659.497361] env[60175]: DEBUG oslo_concurrency.lockutils [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] Acquiring lock "029d2099-2e55-4632-81b6-b59d6a20faab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 659.497513] env[60175]: DEBUG oslo_concurrency.lockutils [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] Lock "029d2099-2e55-4632-81b6-b59d6a20faab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 659.497674] env[60175]: DEBUG oslo_concurrency.lockutils [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] Lock "029d2099-2e55-4632-81b6-b59d6a20faab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 659.497877] env[60175]: DEBUG nova.compute.manager [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] No waiting events found dispatching network-vif-plugged-5e559ae9-2ba8-4907-b16c-34d4c09c7d10 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 659.497990] env[60175]: WARNING nova.compute.manager [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Received unexpected event network-vif-plugged-5e559ae9-2ba8-4907-b16c-34d4c09c7d10 for instance with vm_state building and task_state spawning. [ 659.498166] env[60175]: DEBUG nova.compute.manager [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Received event network-changed-5e559ae9-2ba8-4907-b16c-34d4c09c7d10 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 659.498317] env[60175]: DEBUG nova.compute.manager [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Refreshing instance network info cache due to event network-changed-5e559ae9-2ba8-4907-b16c-34d4c09c7d10. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 659.500927] env[60175]: DEBUG oslo_concurrency.lockutils [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] Acquiring lock "refresh_cache-029d2099-2e55-4632-81b6-b59d6a20faab" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 659.500927] env[60175]: DEBUG oslo_concurrency.lockutils [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] Acquired lock "refresh_cache-029d2099-2e55-4632-81b6-b59d6a20faab" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 659.500927] env[60175]: DEBUG nova.network.neutron [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Refreshing network info cache for port 5e559ae9-2ba8-4907-b16c-34d4c09c7d10 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 659.664717] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292898, 'name': CreateVM_Task, 'duration_secs': 0.306899} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 659.665150] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 659.666089] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 659.668018] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 659.668018] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 659.668018] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5564bc17-dc7c-409d-9086-e87d1434c6c6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 659.676018] env[60175]: DEBUG oslo_vmware.api [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Waiting for the task: (returnval){ [ 659.676018] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]529b5d5a-8f2e-b010-6d72-6bc1dfb3765d" [ 659.676018] env[60175]: _type = "Task" [ 659.676018] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 659.686146] env[60175]: DEBUG oslo_vmware.api [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]529b5d5a-8f2e-b010-6d72-6bc1dfb3765d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 660.191823] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 660.192113] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 660.192317] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 660.346082] env[60175]: DEBUG nova.network.neutron [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Updated VIF entry in instance network info cache for port 5e559ae9-2ba8-4907-b16c-34d4c09c7d10. 
{{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 660.346455] env[60175]: DEBUG nova.network.neutron [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Updating instance_info_cache with network_info: [{"id": "5e559ae9-2ba8-4907-b16c-34d4c09c7d10", "address": "fa:16:3e:54:91:c0", "network": {"id": "15a5fb8a-f254-4fec-9290-6fb8aa23774e", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-868109212-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "94fa7bddb7f64f01baa46ea6cba2bdb1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ccc845e3-654b-43c6-acea-dde1084f0ad0", "external-id": "nsx-vlan-transportzone-344", "segmentation_id": 344, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5e559ae9-2b", "ovs_interfaceid": "5e559ae9-2ba8-4907-b16c-34d4c09c7d10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 660.361143] env[60175]: DEBUG oslo_concurrency.lockutils [req-fc642cd7-82b5-4cff-861f-1594beb2b90e req-878fbfd3-0343-4b80-a233-709b918075c3 service nova] Releasing lock "refresh_cache-029d2099-2e55-4632-81b6-b59d6a20faab" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 664.360752] env[60175]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Acquiring lock "fb825c5f-bd66-40aa-8027-cb425f3b9b96" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.361059] env[60175]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "fb825c5f-bd66-40aa-8027-cb425f3b9b96" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.946697] env[60175]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Acquiring lock "c76409ad-b0aa-4da6-ac83-58f617ec2588" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.947571] env[60175]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock 
"c76409ad-b0aa-4da6-ac83-58f617ec2588" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.172426] env[60175]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Acquiring lock "63823a4b-97e0-48f9-9fb9-7c4fe3858343" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.172690] env[60175]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "63823a4b-97e0-48f9-9fb9-7c4fe3858343" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.073608] env[60175]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "070d142d-6a47-49bc-a061-3101da79447a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.073860] env[60175]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "070d142d-6a47-49bc-a061-3101da79447a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.951074] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 682.951388] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 682.951462] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 682.961866] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.962138] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.962313] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.962482] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 682.963587] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e13e86b5-2f3b-41af-8756-6452d466e205 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.972422] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92c6e9a9-62f0-463c-b9d3-74279aeda499 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.986529] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1aeb5c10-5118-4511-a433-a59a6f8f44e1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.992824] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a190c2f5-6f9d-4371-9fb1-24a2a45cecfa {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.022581] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180629MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 683.022745] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.022941] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.085773] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance b5636e10-af08-49d3-a9b2-8122521a9e2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 683.085993] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 99d97004-9f23-48ee-a88b-75fdb6acc4b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 683.086154] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 683.086278] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 72caf1e5-e894-4581-a95d-21dda85e11b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 683.086432] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 683.086568] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance da3eaeea-ce26-40eb-af8b-8857f927e431 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 683.086687] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 8bc7299c-35d4-4e9f-a243-2834fbadd987 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 683.086801] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 81af879b-3bc3-4aff-a99d-98d3aba73512 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 683.086914] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 843d4db6-c1fb-4b74-ad3c-779e309a170e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 683.087067] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 029d2099-2e55-4632-81b6-b59d6a20faab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 683.113450] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 068814dd-328c-48d1-b514-34eb43b0f2b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.136234] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 500d78f9-ee0c-4620-9936-1a9b4f4fc09a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.145539] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 39a2035c-bb7b-4837-b556-e8bb38ffb514 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.154297] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 57a5dcae-6861-418a-a041-9cd5b7a43982 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.162810] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 6c94c59c-44ab-4cb9-8480-18e8a424993b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.171204] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.179432] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 71244679-78d6-4d49-b4b5-ef96fd313ae8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.187858] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance fb825c5f-bd66-40aa-8027-cb425f3b9b96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.196287] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance c76409ad-b0aa-4da6-ac83-58f617ec2588 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.204660] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 63823a4b-97e0-48f9-9fb9-7c4fe3858343 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.213183] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 070d142d-6a47-49bc-a061-3101da79447a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 683.213409] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 683.213556] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=149GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 683.441418] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-416230d6-c2fa-42c3-aebb-4f520ebaeb42 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.449804] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b341330e-5e26-4e4d-8642-e3040ab02643 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.479713] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f17cf13-d8d7-4e91-adb3-c91c14853572 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.487177] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96d487d7-e866-4c1a-b323-0658407df6ed {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.500234] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 683.508482] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 683.522221] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 683.522395] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.499s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.516992] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running 
periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 685.517265] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 685.517411] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 685.517705] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 685.537525] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 685.537890] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 685.538149] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 685.538401] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 685.538629] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 685.538852] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 685.539084] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 685.539320] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Skipping network cache update for instance because it is Building. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 685.539587] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 685.539829] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 685.540060] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 685.540623] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 685.540933] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 685.950276] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 685.950461] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 685.950654] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 703.155045] env[60175]: WARNING oslo_vmware.rw_handles [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 703.155045] env[60175]: ERROR oslo_vmware.rw_handles [ 703.155045] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/f1fa1a70-4838-4e87-b20a-1a9833468cb7/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 703.156668] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 703.156944] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Copying Virtual Disk [datastore2] vmware_temp/f1fa1a70-4838-4e87-b20a-1a9833468cb7/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/f1fa1a70-4838-4e87-b20a-1a9833468cb7/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 703.157360] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0e232262-3def-4b30-ae89-ee70475e6202 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.165282] env[60175]: DEBUG oslo_vmware.api [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Waiting for the task: 
(returnval){ [ 703.165282] env[60175]: value = "task-4292899" [ 703.165282] env[60175]: _type = "Task" [ 703.165282] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 703.173753] env[60175]: DEBUG oslo_vmware.api [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Task: {'id': task-4292899, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 703.676065] env[60175]: DEBUG oslo_vmware.exceptions [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Fault InvalidArgument not matched. {{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 703.676065] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 703.676622] env[60175]: ERROR nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 703.676622] env[60175]: Faults: ['InvalidArgument'] [ 703.676622] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Traceback (most recent call last): [ 703.676622] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 703.676622] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] yield resources [ 703.676622] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 703.676622] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] self.driver.spawn(context, instance, image_meta, [ 703.676622] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 703.676622] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 703.676622] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 703.676622] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] self._fetch_image_if_missing(context, vi) [ 703.676622] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] 
image_cache(vi, tmp_image_ds_loc) [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] vm_util.copy_virtual_disk( [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] session._wait_for_task(vmdk_copy_task) [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] return self.wait_for_task(task_ref) [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] return evt.wait() [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] result = hub.switch() [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 703.676998] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] return self.greenlet.switch() [ 703.677373] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 703.677373] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] self.f(*self.args, **self.kw) [ 703.677373] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 703.677373] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] raise exceptions.translate_fault(task_info.error) [ 703.677373] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 703.677373] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Faults: ['InvalidArgument'] [ 703.677373] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] [ 703.677373] env[60175]: INFO nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Terminating instance [ 703.678409] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] 
Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 703.678604] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 703.679013] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-74711714-c4d2-4e98-87e4-199db115dc5b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.681118] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 703.681301] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 703.682023] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2b12058-dedb-40ae-9ba7-643d9a97ec5a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.689434] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 703.689643] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d648a2a4-0ae3-4664-a3f1-82610b082230 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.693202] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 703.693369] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 703.694010] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-17a35c2e-aa1f-4925-90ea-6f8e68070b53 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.698683] env[60175]: DEBUG oslo_vmware.api [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Waiting for the task: (returnval){ [ 703.698683] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52000e0b-4b7d-0bf2-3c2d-3de2036fa330" [ 703.698683] env[60175]: _type = "Task" [ 703.698683] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 703.705797] env[60175]: DEBUG oslo_vmware.api [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52000e0b-4b7d-0bf2-3c2d-3de2036fa330, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 703.754179] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 703.754412] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 703.754589] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Deleting the datastore file [datastore2] 99d97004-9f23-48ee-a88b-75fdb6acc4b8 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 703.754860] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8351d6fd-aa28-4e3a-a471-3a7930dd6c72 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.761561] env[60175]: DEBUG oslo_vmware.api [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Waiting for the task: (returnval){ [ 703.761561] env[60175]: value = "task-4292901" [ 703.761561] env[60175]: _type = "Task" [ 703.761561] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 703.769515] env[60175]: DEBUG oslo_vmware.api [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Task: {'id': task-4292901, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 704.210035] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 704.210035] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Creating directory with path [datastore2] vmware_temp/e5f04995-fde4-493b-8148-cd85ec15f29f/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 704.210035] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c29890de-b346-4db8-9a5a-5e5f5de2ab72 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.221344] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Created directory with path [datastore2] vmware_temp/e5f04995-fde4-493b-8148-cd85ec15f29f/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 704.221577] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Fetch image to [datastore2] vmware_temp/e5f04995-fde4-493b-8148-cd85ec15f29f/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 704.221794] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/e5f04995-fde4-493b-8148-cd85ec15f29f/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 704.222567] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-589ab120-4c7d-4c89-9d13-536d7b96a5ab {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.229036] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2cf7f92-f9ff-4b02-895f-3b9caf279e50 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.238011] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81b860c9-faa5-4433-8aa1-b4f5e340e823 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.272030] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-cfb5aefc-b3be-4af1-a690-8f6598db6574 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.278980] env[60175]: DEBUG oslo_vmware.api [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Task: {'id': task-4292901, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074764} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 704.280524] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 704.280712] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 704.280899] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 704.281081] env[60175]: INFO nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 704.282856] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7e433e30-3c16-494c-8be1-7f713aba0a39 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.284744] env[60175]: DEBUG nova.compute.claims [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 704.284914] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.285137] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.383023] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 704.427990] env[60175]: DEBUG oslo_vmware.rw_handles [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e5f04995-fde4-493b-8148-cd85ec15f29f/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 704.486772] env[60175]: DEBUG oslo_vmware.rw_handles [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 704.487012] env[60175]: DEBUG oslo_vmware.rw_handles [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e5f04995-fde4-493b-8148-cd85ec15f29f/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 704.616266] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4de9456-673c-44e0-93fe-b8d650642151 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.623825] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb9e3536-dd39-47f8-afc1-3971453044fe {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.652512] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9604506a-3168-411b-8a6d-b7d81375f533 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.659310] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed92254f-a6c8-4f03-8ef1-c5c46eb5ea15 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.672867] env[60175]: DEBUG nova.compute.provider_tree [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 704.680504] env[60175]: DEBUG nova.scheduler.client.report [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 704.693581] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.408s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.694088] env[60175]: ERROR nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 704.694088] env[60175]: Faults: ['InvalidArgument'] [ 704.694088] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Traceback (most recent call last): [ 704.694088] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 704.694088] env[60175]: ERROR nova.compute.manager [instance: 
99d97004-9f23-48ee-a88b-75fdb6acc4b8] self.driver.spawn(context, instance, image_meta, [ 704.694088] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 704.694088] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 704.694088] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 704.694088] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] self._fetch_image_if_missing(context, vi) [ 704.694088] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 704.694088] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] image_cache(vi, tmp_image_ds_loc) [ 704.694088] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] vm_util.copy_virtual_disk( [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] session._wait_for_task(vmdk_copy_task) [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] return self.wait_for_task(task_ref) [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] return evt.wait() [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] result = hub.switch() [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] return self.greenlet.switch() [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 704.694455] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] self.f(*self.args, **self.kw) [ 704.694816] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 704.694816] env[60175]: ERROR 
nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] raise exceptions.translate_fault(task_info.error) [ 704.694816] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 704.694816] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Faults: ['InvalidArgument'] [ 704.694816] env[60175]: ERROR nova.compute.manager [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] [ 704.694816] env[60175]: DEBUG nova.compute.utils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 704.696016] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Build of instance 99d97004-9f23-48ee-a88b-75fdb6acc4b8 was re-scheduled: A specified parameter was not correct: fileType [ 704.696016] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 704.696384] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 704.696551] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 704.696700] env[60175]: DEBUG nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 704.696857] env[60175]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 705.057467] env[60175]: DEBUG nova.network.neutron [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 705.075956] env[60175]: INFO nova.compute.manager [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] [instance: 99d97004-9f23-48ee-a88b-75fdb6acc4b8] Took 0.38 seconds to deallocate network for instance. [ 705.171506] env[60175]: INFO nova.scheduler.client.report [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Deleted allocations for instance 99d97004-9f23-48ee-a88b-75fdb6acc4b8 [ 705.194432] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a747f954-c669-4851-9f0f-ab9b73b29f0f tempest-DeleteServersAdminTestJSON-579292529 tempest-DeleteServersAdminTestJSON-579292529-project-member] Lock "99d97004-9f23-48ee-a88b-75fdb6acc4b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 151.746s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.207021] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 705.255587] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.255859] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.257408] env[60175]: INFO nova.compute.claims [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 705.548848] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8146185a-0f75-45ce-bf15-346801b4db45 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.557169] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffbe6010-6d7d-4ca6-b959-682ba7787e37 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.589028] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e48bedf2-8621-4ffb-866c-d57b5c2ca025 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.595234] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80b12d2a-b564-47cd-8b37-665702977773 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.608184] env[60175]: DEBUG nova.compute.provider_tree [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 705.617630] env[60175]: DEBUG nova.scheduler.client.report [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 705.632617] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.377s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.633138] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 705.667075] env[60175]: DEBUG nova.compute.utils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 705.668435] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 705.668658] env[60175]: DEBUG nova.network.neutron [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 705.677714] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 705.740437] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Start spawning the instance on the hypervisor. 
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 705.762461] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 705.762701] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 705.762860] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 705.763096] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 705.763268] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 705.763414] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 705.763613] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 705.763766] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 705.763931] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Got 1 possible 
topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 705.764112] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 705.764285] env[60175]: DEBUG nova.virt.hardware [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 705.765143] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b084105-86d4-4033-9623-8bf6fd2434e1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.774623] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b49c789b-ccc5-4989-bb6b-70935a8d7aae {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.929115] env[60175]: DEBUG nova.policy [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '81d0dfe7783f4cfebc10dafb19a456fc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '58b9dd37c4a94f59a5afc2c931ee30a9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 706.437671] env[60175]: DEBUG nova.network.neutron [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Successfully created port: 9fc574b7-1869-4a09-8bd5-72d1381e6c73 {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 707.413543] env[60175]: DEBUG nova.compute.manager [req-632012dd-2f02-4d16-9761-640837d8303c req-eaad65a6-a66a-468e-ae53-d283aa938cfa service nova] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Received event network-vif-plugged-9fc574b7-1869-4a09-8bd5-72d1381e6c73 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 707.413887] env[60175]: DEBUG oslo_concurrency.lockutils [req-632012dd-2f02-4d16-9761-640837d8303c req-eaad65a6-a66a-468e-ae53-d283aa938cfa service nova] Acquiring lock "068814dd-328c-48d1-b514-34eb43b0f2b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.414102] env[60175]: DEBUG oslo_concurrency.lockutils [req-632012dd-2f02-4d16-9761-640837d8303c req-eaad65a6-a66a-468e-ae53-d283aa938cfa service nova] Lock "068814dd-328c-48d1-b514-34eb43b0f2b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.414267] 
env[60175]: DEBUG oslo_concurrency.lockutils [req-632012dd-2f02-4d16-9761-640837d8303c req-eaad65a6-a66a-468e-ae53-d283aa938cfa service nova] Lock "068814dd-328c-48d1-b514-34eb43b0f2b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.414431] env[60175]: DEBUG nova.compute.manager [req-632012dd-2f02-4d16-9761-640837d8303c req-eaad65a6-a66a-468e-ae53-d283aa938cfa service nova] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] No waiting events found dispatching network-vif-plugged-9fc574b7-1869-4a09-8bd5-72d1381e6c73 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 707.414591] env[60175]: WARNING nova.compute.manager [req-632012dd-2f02-4d16-9761-640837d8303c req-eaad65a6-a66a-468e-ae53-d283aa938cfa service nova] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Received unexpected event network-vif-plugged-9fc574b7-1869-4a09-8bd5-72d1381e6c73 for instance with vm_state building and task_state spawning. [ 707.474822] env[60175]: DEBUG nova.network.neutron [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Successfully updated port: 9fc574b7-1869-4a09-8bd5-72d1381e6c73 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 707.489566] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "refresh_cache-068814dd-328c-48d1-b514-34eb43b0f2b1" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.489873] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquired lock "refresh_cache-068814dd-328c-48d1-b514-34eb43b0f2b1" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 707.489968] env[60175]: DEBUG nova.network.neutron [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 707.555871] env[60175]: DEBUG nova.network.neutron [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Instance cache missing network info. 
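The warning above ("Received unexpected event network-vif-plugged-9fc574b7-… for instance with vm_state building") is the usual symptom of an external event arriving before any waiter has registered for it. A generic sketch of that prepare-then-wait pattern with threading primitives; it only illustrates the race the warning describes and is not Nova's InstanceEvents code:

    import threading

    class ExternalEvents:
        """Register interest in a named event, then wait for it or warn if it
        arrives unannounced -- a stand-in for the pattern in the log above."""

        def __init__(self):
            self._events = {}
            self._lock = threading.Lock()

        def prepare(self, name):
            # Register interest *before* triggering the action that emits the event.
            with self._lock:
                self._events[name] = threading.Event()

        def deliver(self, name):
            with self._lock:
                evt = self._events.get(name)
            if evt is None:
                print("WARNING: received unexpected event %s" % name)
                return
            evt.set()

        def wait(self, name, timeout=300):
            with self._lock:
                evt = self._events.get(name)
            return evt is not None and evt.wait(timeout)

    ev = ExternalEvents()
    ev.deliver("network-vif-plugged-9fc574b7")   # arrives before anyone prepared -> warning
    ev.prepare("network-vif-plugged-9fc574b7")
    ev.deliver("network-vif-plugged-9fc574b7")
    print(ev.wait("network-vif-plugged-9fc574b7", timeout=1))   # True

Registering before the port update is triggered is what makes the second deliver() succeed; the first, unregistered delivery corresponds to the WARNING line in the log.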
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 707.840123] env[60175]: DEBUG nova.network.neutron [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Updating instance_info_cache with network_info: [{"id": "9fc574b7-1869-4a09-8bd5-72d1381e6c73", "address": "fa:16:3e:d8:76:5e", "network": {"id": "649a49bb-a50a-40b7-8692-8fbbc71ae710", "bridge": "br-int", "label": "tempest-ServersTestJSON-1016732347-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "58b9dd37c4a94f59a5afc2c931ee30a9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60567ee6-01d0-4b16-9c7a-4a896827d6eb", "external-id": "nsx-vlan-transportzone-28", "segmentation_id": 28, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9fc574b7-18", "ovs_interfaceid": "9fc574b7-1869-4a09-8bd5-72d1381e6c73", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.853052] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Releasing lock "refresh_cache-068814dd-328c-48d1-b514-34eb43b0f2b1" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 707.853052] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Instance network_info: |[{"id": "9fc574b7-1869-4a09-8bd5-72d1381e6c73", "address": "fa:16:3e:d8:76:5e", "network": {"id": "649a49bb-a50a-40b7-8692-8fbbc71ae710", "bridge": "br-int", "label": "tempest-ServersTestJSON-1016732347-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "58b9dd37c4a94f59a5afc2c931ee30a9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60567ee6-01d0-4b16-9c7a-4a896827d6eb", "external-id": "nsx-vlan-transportzone-28", "segmentation_id": 28, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9fc574b7-18", "ovs_interfaceid": "9fc574b7-1869-4a09-8bd5-72d1381e6c73", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 707.853230] env[60175]: DEBUG 
nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d8:76:5e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '60567ee6-01d0-4b16-9c7a-4a896827d6eb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9fc574b7-1869-4a09-8bd5-72d1381e6c73', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 707.860007] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Creating folder: Project (58b9dd37c4a94f59a5afc2c931ee30a9). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 707.860513] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9887bdef-7e82-476d-a91f-7bd3d2f65e5f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.874118] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Created folder: Project (58b9dd37c4a94f59a5afc2c931ee30a9) in parent group-v845475. [ 707.874400] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Creating folder: Instances. Parent ref: group-v845520. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 707.874628] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f9b57894-2701-440a-997a-a317ac807ed5 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.883126] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Created folder: Instances in parent group-v845520. [ 707.883352] env[60175]: DEBUG oslo.service.loopingcall [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 707.883525] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 707.883711] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b298e15a-bf02-4232-b012-813700ca4264 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.902974] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 707.902974] env[60175]: value = "task-4292904" [ 707.902974] env[60175]: _type = "Task" [ 707.902974] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 707.911711] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292904, 'name': CreateVM_Task} progress is 0%. 
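The "Instance VIF info" entry above is a distilled view of the network_info cache logged a little earlier: port 9fc574b7-…, MAC fa:16:3e:d8:76:5e, bridge br-int, fixed IP 192.168.128.9 and NSX logical switch 60567ee6-…. A small sketch of pulling those fields out of a network_info-shaped list; the dict literal only repeats values already present in the log, and the helper is illustrative rather than the vmwareapi driver's code:

    # One VIF entry, trimmed to the fields used below (values from the log above).
    network_info = [{
        "id": "9fc574b7-1869-4a09-8bd5-72d1381e6c73",
        "address": "fa:16:3e:d8:76:5e",
        "network": {"bridge": "br-int",
                    "subnets": [{"ips": [{"address": "192.168.128.9"}]}]},
        "details": {"nsx-logical-switch-id": "60567ee6-01d0-4b16-9c7a-4a896827d6eb"},
        "devname": "tap9fc574b7-18",
    }]

    def vif_summary(vif):
        """Collect the pieces that show up in the 'Instance VIF info' entry."""
        return {
            "iface_id": vif["id"],
            "mac_address": vif["address"],
            "network_name": vif["network"]["bridge"],
            "fixed_ip": vif["network"]["subnets"][0]["ips"][0]["address"],
            "nsx_switch": vif["details"]["nsx-logical-switch-id"],
            "devname": vif["devname"],
        }

    for vif in network_info:
        print(vif_summary(vif))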
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 708.412946] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292904, 'name': CreateVM_Task, 'duration_secs': 0.287537} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 708.413119] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 708.413804] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 708.413965] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 708.414304] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 708.414538] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0a44ee53-0942-4ca8-afd6-179f1e0a053c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.418958] env[60175]: DEBUG oslo_vmware.api [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Waiting for the task: (returnval){ [ 708.418958] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5204967b-828e-0f40-00a0-b763d5d5d60b" [ 708.418958] env[60175]: _type = "Task" [ 708.418958] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 708.426063] env[60175]: DEBUG oslo_vmware.api [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5204967b-828e-0f40-00a0-b763d5d5d60b, 'name': SearchDatastore_Task} progress is 0%. 
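CreateVM_Task above goes from "progress is 0%" to completed in 0.287537s, and the SearchDatastore_Task that follows is waited on the same way; the paths in the entries (oslo_vmware/api.py wait_for_task and _poll_task) point at a poll-until-done loop. A generic sketch of that loop under the assumption of a simple poll() callable returning (state, payload); it is not the oslo.vmware implementation:

    import time

    class TaskError(Exception):
        """Raised when the polled task reports an error state."""

    def wait_for_task(poll, interval=0.5, timeout=300.0):
        """Poll `poll()` until it reports success, sleeping between tries.

        `poll` is assumed to return (state, payload) with state in
        {'queued', 'running', 'success', 'error'} -- a stand-in for the
        task-info object a vSphere session would return.
        """
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            state, payload = poll()
            if state == "success":
                return payload
            if state == "error":
                raise TaskError(payload)
            time.sleep(interval)   # the 'progress is 0%' entries fall between polls
        raise TimeoutError("task did not complete within %.0fs" % timeout)

    # Toy usage: a task that "completes" on the third poll.
    _calls = {"n": 0}
    def fake_poll():
        _calls["n"] += 1
        return ("success", "vm-123") if _calls["n"] >= 3 else ("running", None)

    print(wait_for_task(fake_poll, interval=0.01))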
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 708.929889] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 708.930255] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 708.930381] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 709.449118] env[60175]: DEBUG nova.compute.manager [req-05171230-0a2c-4391-9deb-af0dd62b24fe req-0e8ccd52-67c4-4acd-a1be-15eece544f61 service nova] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Received event network-changed-9fc574b7-1869-4a09-8bd5-72d1381e6c73 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 709.449338] env[60175]: DEBUG nova.compute.manager [req-05171230-0a2c-4391-9deb-af0dd62b24fe req-0e8ccd52-67c4-4acd-a1be-15eece544f61 service nova] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Refreshing instance network info cache due to event network-changed-9fc574b7-1869-4a09-8bd5-72d1381e6c73. {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 709.449552] env[60175]: DEBUG oslo_concurrency.lockutils [req-05171230-0a2c-4391-9deb-af0dd62b24fe req-0e8ccd52-67c4-4acd-a1be-15eece544f61 service nova] Acquiring lock "refresh_cache-068814dd-328c-48d1-b514-34eb43b0f2b1" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 709.449692] env[60175]: DEBUG oslo_concurrency.lockutils [req-05171230-0a2c-4391-9deb-af0dd62b24fe req-0e8ccd52-67c4-4acd-a1be-15eece544f61 service nova] Acquired lock "refresh_cache-068814dd-328c-48d1-b514-34eb43b0f2b1" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 709.449880] env[60175]: DEBUG nova.network.neutron [req-05171230-0a2c-4391-9deb-af0dd62b24fe req-0e8ccd52-67c4-4acd-a1be-15eece544f61 service nova] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Refreshing network info cache for port 9fc574b7-1869-4a09-8bd5-72d1381e6c73 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 709.766021] env[60175]: DEBUG nova.network.neutron [req-05171230-0a2c-4391-9deb-af0dd62b24fe req-0e8ccd52-67c4-4acd-a1be-15eece544f61 service nova] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Updated VIF entry in instance network info cache for port 9fc574b7-1869-4a09-8bd5-72d1381e6c73. 
{{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 709.766392] env[60175]: DEBUG nova.network.neutron [req-05171230-0a2c-4391-9deb-af0dd62b24fe req-0e8ccd52-67c4-4acd-a1be-15eece544f61 service nova] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Updating instance_info_cache with network_info: [{"id": "9fc574b7-1869-4a09-8bd5-72d1381e6c73", "address": "fa:16:3e:d8:76:5e", "network": {"id": "649a49bb-a50a-40b7-8692-8fbbc71ae710", "bridge": "br-int", "label": "tempest-ServersTestJSON-1016732347-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "58b9dd37c4a94f59a5afc2c931ee30a9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60567ee6-01d0-4b16-9c7a-4a896827d6eb", "external-id": "nsx-vlan-transportzone-28", "segmentation_id": 28, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9fc574b7-18", "ovs_interfaceid": "9fc574b7-1869-4a09-8bd5-72d1381e6c73", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 709.778154] env[60175]: DEBUG oslo_concurrency.lockutils [req-05171230-0a2c-4391-9deb-af0dd62b24fe req-0e8ccd52-67c4-4acd-a1be-15eece544f61 service nova] Releasing lock "refresh_cache-068814dd-328c-48d1-b514-34eb43b0f2b1" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 743.950176] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 743.960489] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.960697] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.960863] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.961023] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 743.962530] 
env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1597e6f7-ba82-4fc5-abb3-c69bba00d252 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.971435] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e9f3c2c-a4c2-4768-b347-55901a8de621 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.984946] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5f14962-e055-4529-a72f-793fc7b9cae8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.991496] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1843e10a-705c-4f67-aeee-d8abcb12b7ff {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.021290] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180706MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 744.021458] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.021654] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.087129] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance b5636e10-af08-49d3-a9b2-8122521a9e2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 744.087308] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 744.087439] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 72caf1e5-e894-4581-a95d-21dda85e11b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 744.087562] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 744.087682] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance da3eaeea-ce26-40eb-af8b-8857f927e431 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 744.087801] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 8bc7299c-35d4-4e9f-a243-2834fbadd987 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 744.087918] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 81af879b-3bc3-4aff-a99d-98d3aba73512 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 744.088043] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 843d4db6-c1fb-4b74-ad3c-779e309a170e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 744.088160] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 029d2099-2e55-4632-81b6-b59d6a20faab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 744.088275] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 068814dd-328c-48d1-b514-34eb43b0f2b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 744.098801] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 500d78f9-ee0c-4620-9936-1a9b4f4fc09a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 744.126513] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 39a2035c-bb7b-4837-b556-e8bb38ffb514 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 744.137621] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 57a5dcae-6861-418a-a041-9cd5b7a43982 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 744.148163] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 6c94c59c-44ab-4cb9-8480-18e8a424993b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 744.158402] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 744.168034] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 71244679-78d6-4d49-b4b5-ef96fd313ae8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 744.178642] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance fb825c5f-bd66-40aa-8027-cb425f3b9b96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 744.189387] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance c76409ad-b0aa-4da6-ac83-58f617ec2588 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 744.201940] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 63823a4b-97e0-48f9-9fb9-7c4fe3858343 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 744.211777] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 070d142d-6a47-49bc-a061-3101da79447a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 744.212108] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 744.212323] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=149GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 744.433715] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5c6d8af-db12-4b06-92f2-a623576a636b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.442339] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07f049b2-aa15-4d7b-8094-f325c8c03172 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.471780] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b651168-5b74-4853-9cc8-39158eaf3a5d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.479054] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0661118d-3d3d-46a4-9cbe-1f3369ff9a66 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.491926] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.502357] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.515368] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 744.515548] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.494s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 745.510640] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 745.510991] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 745.510991] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 745.950592] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 746.949550] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 746.949816] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 746.949859] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 746.969582] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 746.969752] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Skipping network cache update for instance because it is Building. 
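The audit above ends with two related sets of numbers: the inventory reported for provider 3984c8da-… and the "Final resource view" (used_ram=1792MB, used_disk=10GB, used_vcpus=10). A short sketch that reproduces both from the figures in the log: effective capacity as (total - reserved) * allocation_ratio, which is a simplified reading of the placement arithmetic rather than the scheduler's code, and the usage totals from the ten actively managed 128 MB / 1 vCPU / 1 GB allocations plus the 512 MB memory reservation (the reservation's inclusion in used_ram is an inference from the numbers):

    # Figures copied from the inventory and audit entries above.
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 268,    "reserved": 0,   "allocation_ratio": 1.0},
    }

    def capacity(inv):
        """Effective capacity per resource class: (total - reserved) * allocation_ratio."""
        return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
                for rc, v in inv.items()}

    print(capacity(inventory))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 268.0}

    # Usage side: ten actively managed instances, each {MEMORY_MB: 128, VCPU: 1, DISK_GB: 1}.
    instances = 10
    used_ram_mb = instances * 128 + inventory["MEMORY_MB"]["reserved"]  # 1792, as logged
    used_disk_gb = instances * 1                                        # 10 GB, as logged
    used_vcpus = instances * 1                                          # 10, as logged
    print(used_ram_mb, used_disk_gb, used_vcpus)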
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 746.969885] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 746.970026] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 746.970197] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 746.970325] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 746.970444] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 746.970589] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 746.970677] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 746.970790] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 746.970925] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. 
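The ComputeManager._heal_instance_info_cache, _poll_unconfirmed_resizes and similar entries are periodic tasks dispatched by oslo.service's run_periodic_tasks (periodic_task.py:210 in the entries above). A minimal sketch of declaring and ticking such a task with the oslo.service decorator; the manager and task names are placeholders, not Nova's:

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class DemoManager(periodic_task.PeriodicTasks):
        """Toy manager with one periodic task, mirroring the dispatch pattern in the log."""

        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60, run_immediately=True)
        def _heal_demo_cache(self, context):
            print("running periodic task _heal_demo_cache")

    # A real service drives this from a timer loop; one manual tick for illustration.
    mgr = DemoManager()
    mgr.run_periodic_tasks(context=None)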
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 746.971463] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 746.971637] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 746.971788] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 746.971918] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 747.968206] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 751.121832] env[60175]: WARNING oslo_vmware.rw_handles [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 751.121832] env[60175]: ERROR oslo_vmware.rw_handles [ 751.122506] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/e5f04995-fde4-493b-8148-cd85ec15f29f/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 751.124089] env[60175]: 
DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 751.124347] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Copying Virtual Disk [datastore2] vmware_temp/e5f04995-fde4-493b-8148-cd85ec15f29f/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/e5f04995-fde4-493b-8148-cd85ec15f29f/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 751.124634] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f6a51b05-f468-46b1-851b-ed2eaa5660c9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.137312] env[60175]: DEBUG oslo_vmware.api [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Waiting for the task: (returnval){ [ 751.137312] env[60175]: value = "task-4292905" [ 751.137312] env[60175]: _type = "Task" [ 751.137312] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 751.148284] env[60175]: DEBUG oslo_vmware.api [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Task: {'id': task-4292905, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 751.648393] env[60175]: DEBUG oslo_vmware.exceptions [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Fault InvalidArgument not matched. 
{{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 751.648647] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 751.649200] env[60175]: ERROR nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 751.649200] env[60175]: Faults: ['InvalidArgument'] [ 751.649200] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Traceback (most recent call last): [ 751.649200] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 751.649200] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] yield resources [ 751.649200] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 751.649200] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] self.driver.spawn(context, instance, image_meta, [ 751.649200] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 751.649200] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 751.649200] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 751.649200] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] self._fetch_image_if_missing(context, vi) [ 751.649200] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] image_cache(vi, tmp_image_ds_loc) [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] vm_util.copy_virtual_disk( [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] session._wait_for_task(vmdk_copy_task) [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] return self.wait_for_task(task_ref) [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] return evt.wait() [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] result = hub.switch() [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 751.649612] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] return self.greenlet.switch() [ 751.650021] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 751.650021] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] self.f(*self.args, **self.kw) [ 751.650021] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 751.650021] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] raise exceptions.translate_fault(task_info.error) [ 751.650021] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 751.650021] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Faults: ['InvalidArgument'] [ 751.650021] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] [ 751.650021] env[60175]: INFO nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Terminating instance [ 751.651113] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 751.651341] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 751.651590] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5861d804-ac8e-44ab-95b3-c44aa1f2f5b9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.653810] env[60175]: 
DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 751.654011] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 751.654730] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5827396c-c4c6-4792-b185-4a5b320878fe {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.661766] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 751.661972] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-889c6066-2a01-402c-a344-e63bfdd797db {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.664988] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 751.665215] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 751.665894] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-99bb13b5-678d-4b82-bee8-179107a04740 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.670994] env[60175]: DEBUG oslo_vmware.api [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Waiting for the task: (returnval){ [ 751.670994] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5231c504-c473-26d1-4f80-d6184e300916" [ 751.670994] env[60175]: _type = "Task" [ 751.670994] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 751.678611] env[60175]: DEBUG oslo_vmware.api [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5231c504-c473-26d1-4f80-d6184e300916, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 751.755253] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 751.755492] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 751.755652] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Deleting the datastore file [datastore2] b5636e10-af08-49d3-a9b2-8122521a9e2c {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 751.755918] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-23f3afbd-abc7-4849-9359-64e01c8c652b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.762650] env[60175]: DEBUG oslo_vmware.api [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Waiting for the task: (returnval){ [ 751.762650] env[60175]: value = "task-4292907" [ 751.762650] env[60175]: _type = "Task" [ 751.762650] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 751.770303] env[60175]: DEBUG oslo_vmware.api [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Task: {'id': task-4292907, 'name': DeleteDatastoreFile_Task} progress is 0%. 
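The Unregistering the VM and DeleteDatastoreFile_Task records above correspond to the usual two-step teardown: drop the VM from the vCenter inventory, then remove its directory from the datastore. A hedged sketch of that sequence via oslo.vmware (not Nova's vmops code); vm_ref and dc_ref are assumed to be managed-object references obtained elsewhere, and the datastore path is the one named in the log.

def destroy_instance_files(session, vm_ref, dc_ref, ds_path):
    """Unregister the VM, then delete its datastore directory."""
    # UnregisterVM is synchronous and returns no task object.
    session.invoke_api(session.vim, 'UnregisterVM', vm_ref)

    file_mgr = session.vim.service_content.fileManager
    delete_task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_mgr,
        name=ds_path,               # e.g. '[datastore2] b5636e10-af08-49d3-a9b2-8122521a9e2c'
        datacenter=dc_ref)
    session.wait_for_task(delete_task)
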
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 752.182922] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 752.183206] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Creating directory with path [datastore2] vmware_temp/f4854cbb-8866-47a9-b71e-9ea4605a4b77/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 752.183404] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6c01e906-fe73-4f00-af9d-570f82ef8802 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.197864] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Created directory with path [datastore2] vmware_temp/f4854cbb-8866-47a9-b71e-9ea4605a4b77/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 752.198008] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Fetch image to [datastore2] vmware_temp/f4854cbb-8866-47a9-b71e-9ea4605a4b77/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 752.198192] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/f4854cbb-8866-47a9-b71e-9ea4605a4b77/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 752.198926] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2791b87b-15ec-4e49-b077-9dc967441a3b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.205742] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9046821a-f6e6-4241-9f3d-bd42cd45616d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.215046] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8287306-b0d7-4086-ae49-2d0b27c04f80 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.245967] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31a7a381-67a5-422a-bf4a-1c744c3b6ad7 {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.252183] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f5b313ab-0823-4fdb-8678-9807a43537b8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.273128] env[60175]: DEBUG oslo_vmware.api [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Task: {'id': task-4292907, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069197} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 752.274658] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 752.274871] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 752.275091] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 752.275301] env[60175]: INFO nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Took 0.62 seconds to destroy the instance on the hypervisor. 
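The fetch records above follow the driver's image-cache layout: downloads land in a per-request vmware_temp/<uuid>/<image-id>/ directory, while the finished disk lives under the cache folder (devstack-image-cache_base in this environment) keyed by the Glance image id. A pure-string illustration of those paths, not Nova's ds_util API:

import uuid

def image_cache_paths(datastore, image_id):
    """Return (cached_vmdk, temp_sparse_vmdk) datastore paths as seen in this log."""
    cache_dir = f"[{datastore}] devstack-image-cache_base/{image_id}"
    temp_dir = f"[{datastore}] vmware_temp/{uuid.uuid4()}/{image_id}"
    return (f"{cache_dir}/{image_id}.vmdk",
            f"{temp_dir}/tmp-sparse.vmdk")

print(image_cache_paths('datastore2', 'ab7fcb5a-745a-4c08-9c04-49b187178f83'))
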
[ 752.277208] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 752.279344] env[60175]: DEBUG nova.compute.claims [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 752.279513] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.279723] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.327676] env[60175]: DEBUG oslo_vmware.rw_handles [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f4854cbb-8866-47a9-b71e-9ea4605a4b77/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 752.384945] env[60175]: DEBUG oslo_vmware.rw_handles [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 752.385136] env[60175]: DEBUG oslo_vmware.rw_handles [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f4854cbb-8866-47a9-b71e-9ea4605a4b77/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
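The write-handle record above also shows how the upload target is addressed: ESX/vCenter expose datastore files over HTTPS under /folder/<path-inside-datastore>, with the datacenter path and datastore name passed as query parameters. The exact URL from the record can be reconstructed with the standard library (the helper itself is illustrative, not part of oslo.vmware):

from urllib.parse import quote, urlencode

def datastore_file_url(host, ds_file_path, dc_path, ds_name, port=443):
    """Build the /folder URL used to read or write a file on a datastore."""
    query = urlencode({'dcPath': dc_path, 'dsName': ds_name})
    return f"https://{host}:{port}/folder/{quote(ds_file_path)}?{query}"

print(datastore_file_url(
    'esx7c1n2.openstack.eu-de-1.cloud.sap',
    'vmware_temp/f4854cbb-8866-47a9-b71e-9ea4605a4b77/'
    'ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk',
    dc_path='ha-datacenter', ds_name='datastore2'))
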
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 752.586876] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7774d94a-1d20-43d1-a784-84c9a70e5d29 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.594348] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80d85efb-5feb-460c-a85c-388359fcff21 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.626219] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5cd7c30-972c-4264-9b79-b55993b041b4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.633386] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d1a337d-99c1-4492-88e3-0e21080b82fe {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.646418] env[60175]: DEBUG nova.compute.provider_tree [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 752.655028] env[60175]: DEBUG nova.scheduler.client.report [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 752.672015] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.392s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.672597] env[60175]: ERROR nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 752.672597] env[60175]: Faults: ['InvalidArgument'] [ 752.672597] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Traceback (most recent call last): [ 752.672597] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 752.672597] env[60175]: ERROR nova.compute.manager 
[instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] self.driver.spawn(context, instance, image_meta, [ 752.672597] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 752.672597] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 752.672597] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 752.672597] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] self._fetch_image_if_missing(context, vi) [ 752.672597] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 752.672597] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] image_cache(vi, tmp_image_ds_loc) [ 752.672597] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] vm_util.copy_virtual_disk( [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] session._wait_for_task(vmdk_copy_task) [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] return self.wait_for_task(task_ref) [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] return evt.wait() [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] result = hub.switch() [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] return self.greenlet.switch() [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 752.672957] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] self.f(*self.args, **self.kw) [ 752.673309] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 752.673309] 
env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] raise exceptions.translate_fault(task_info.error) [ 752.673309] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 752.673309] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Faults: ['InvalidArgument'] [ 752.673309] env[60175]: ERROR nova.compute.manager [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] [ 752.673309] env[60175]: DEBUG nova.compute.utils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 752.674975] env[60175]: DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Build of instance b5636e10-af08-49d3-a9b2-8122521a9e2c was re-scheduled: A specified parameter was not correct: fileType [ 752.674975] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 752.675396] env[60175]: DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 752.675522] env[60175]: DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
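The re-schedule above is driven by oslo_vmware.exceptions.VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']). A minimal sketch of how a caller can tell that fault class apart; copy_disk is a stand-in for the disk-copy/wait_for_task call shown earlier, not a real API:

from oslo_vmware import exceptions as vexc

def spawn_step(copy_disk):
    """copy_disk: callable standing in for the CopyVirtualDisk_Task + wait above."""
    try:
        copy_disk()
    except vexc.VimFaultException as exc:
        # fault_list carries the raw VIM fault names ('InvalidArgument' here);
        # str(exc) carries the human-readable message about fileType.
        if 'InvalidArgument' in exc.fault_list:
            return 'reschedule'     # roughly the decision the compute manager logs above
        raise
    return 'ok'
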
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 752.675676] env[60175]: DEBUG nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 752.675832] env[60175]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 752.953027] env[60175]: DEBUG nova.network.neutron [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 752.963568] env[60175]: INFO nova.compute.manager [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] [instance: b5636e10-af08-49d3-a9b2-8122521a9e2c] Took 0.29 seconds to deallocate network for instance. [ 753.046315] env[60175]: INFO nova.scheduler.client.report [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Deleted allocations for instance b5636e10-af08-49d3-a9b2-8122521a9e2c [ 753.068706] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6673dde9-7e14-4de0-98da-cd8497c22221 tempest-ServersAdminNegativeTestJSON-1119189019 tempest-ServersAdminNegativeTestJSON-1119189019-project-member] Lock "b5636e10-af08-49d3-a9b2-8122521a9e2c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.646s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.097265] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 753.145468] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.145712] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.147182] env[60175]: INFO nova.compute.claims [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 753.421041] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c87d6ef7-8eda-458a-bbcd-d8674f869bac {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.429944] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5df358e8-2c1e-4e61-9d3e-469ee517cdaa {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.461065] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40f46cdc-826e-4e3b-b13b-405ebe06995b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.467601] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77f41a82-f5ee-47dd-8698-c311d12f988c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.480753] env[60175]: DEBUG nova.compute.provider_tree [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 753.489465] env[60175]: DEBUG nova.scheduler.client.report [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 753.504400] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 
tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.359s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.504860] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 753.534619] env[60175]: DEBUG nova.compute.utils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 753.536013] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 753.536565] env[60175]: DEBUG nova.network.neutron [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 753.546501] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 753.609256] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Start spawning the instance on the hypervisor. 
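The inventory reported for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e translates into schedulable capacity as (total - reserved) * allocation_ratio per resource class; max_unit, min_unit and step_size then constrain any single allocation. A small check that reproduces the numbers logged above:

def schedulable_capacity(inventory):
    """Capacity placement can allocate against, per resource class."""
    return {rc: (inv['total'] - inv['reserved']) * inv['allocation_ratio']
            for rc, inv in inventory.items()}

inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 268, 'reserved': 0, 'allocation_ratio': 1.0},
}
print(schedulable_capacity(inventory))
# -> {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 268.0}
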
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 753.629939] env[60175]: DEBUG nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 753.630259] env[60175]: DEBUG nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 753.630423] env[60175]: DEBUG nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 753.630609] env[60175]: DEBUG nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 753.630749] env[60175]: DEBUG nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 753.630893] env[60175]: DEBUG nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 753.631138] env[60175]: DEBUG nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 753.631321] env[60175]: DEBUG nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 753.631713] env[60175]: DEBUG 
nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 753.631713] env[60175]: DEBUG nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 753.631820] env[60175]: DEBUG nova.virt.hardware [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 753.632660] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de625da0-3739-472b-92c3-03e34f9c1d9f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.640582] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-163a4e83-4ed9-41c8-b92d-41ceac02f671 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.646908] env[60175]: DEBUG nova.policy [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a1b27a07ed01451683d91d3795f68ce4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ae96503b01a9442f96d122810ca18d88', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 754.117556] env[60175]: DEBUG nova.network.neutron [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Successfully created port: 60673b65-811c-44b5-a021-c476b84db981 {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 755.437014] env[60175]: DEBUG nova.network.neutron [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Successfully updated port: 60673b65-811c-44b5-a021-c476b84db981 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 755.445962] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "refresh_cache-500d78f9-ee0c-4620-9936-1a9b4f4fc09a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 755.446142] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 
tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquired lock "refresh_cache-500d78f9-ee0c-4620-9936-1a9b4f4fc09a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 755.446296] env[60175]: DEBUG nova.network.neutron [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 755.514481] env[60175]: DEBUG nova.network.neutron [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.565727] env[60175]: DEBUG nova.compute.manager [req-49dd7321-9831-4f9d-a9f1-1d10ff3ea310 req-c378d847-6657-497b-b99f-80705b020a73 service nova] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Received event network-vif-plugged-60673b65-811c-44b5-a021-c476b84db981 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 755.565727] env[60175]: DEBUG oslo_concurrency.lockutils [req-49dd7321-9831-4f9d-a9f1-1d10ff3ea310 req-c378d847-6657-497b-b99f-80705b020a73 service nova] Acquiring lock "500d78f9-ee0c-4620-9936-1a9b4f4fc09a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 755.565727] env[60175]: DEBUG oslo_concurrency.lockutils [req-49dd7321-9831-4f9d-a9f1-1d10ff3ea310 req-c378d847-6657-497b-b99f-80705b020a73 service nova] Lock "500d78f9-ee0c-4620-9936-1a9b4f4fc09a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 755.565727] env[60175]: DEBUG oslo_concurrency.lockutils [req-49dd7321-9831-4f9d-a9f1-1d10ff3ea310 req-c378d847-6657-497b-b99f-80705b020a73 service nova] Lock "500d78f9-ee0c-4620-9936-1a9b4f4fc09a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 755.566065] env[60175]: DEBUG nova.compute.manager [req-49dd7321-9831-4f9d-a9f1-1d10ff3ea310 req-c378d847-6657-497b-b99f-80705b020a73 service nova] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] No waiting events found dispatching network-vif-plugged-60673b65-811c-44b5-a021-c476b84db981 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 755.566065] env[60175]: WARNING nova.compute.manager [req-49dd7321-9831-4f9d-a9f1-1d10ff3ea310 req-c378d847-6657-497b-b99f-80705b020a73 service nova] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Received unexpected event network-vif-plugged-60673b65-811c-44b5-a021-c476b84db981 for instance with vm_state building and task_state spawning. 
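The nova.virt.hardware records a few entries above walk through CPU topology selection for the 1-vCPU m1.nano flavor: with no flavor or image limits, every (sockets, cores, threads) combination whose product equals the vCPU count is considered, which for one vCPU collapses to 1:1:1. A simplified version of that enumeration (preferences, limit handling and sorting are omitted; this is not Nova's actual function):

import itertools
from collections import namedtuple

Topology = namedtuple('Topology', 'sockets cores threads')

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Yield every topology whose sockets * cores * threads equals vcpus."""
    for s, c, t in itertools.product(
            range(1, min(vcpus, max_sockets) + 1),
            range(1, min(vcpus, max_cores) + 1),
            range(1, min(vcpus, max_threads) + 1)):
        if s * c * t == vcpus:
            yield Topology(s, c, t)

print(list(possible_topologies(1)))   # [Topology(sockets=1, cores=1, threads=1)]
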
[ 755.783505] env[60175]: DEBUG nova.network.neutron [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Updating instance_info_cache with network_info: [{"id": "60673b65-811c-44b5-a021-c476b84db981", "address": "fa:16:3e:07:c6:60", "network": {"id": "912e98c5-112e-4372-956a-a0c21ce4a2a4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-2092124176-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae96503b01a9442f96d122810ca18d88", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "713e54d5-283f-493d-b003-f13182deaf7b", "external-id": "cl2-zone-703", "segmentation_id": 703, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap60673b65-81", "ovs_interfaceid": "60673b65-811c-44b5-a021-c476b84db981", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.795940] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Releasing lock "refresh_cache-500d78f9-ee0c-4620-9936-1a9b4f4fc09a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 755.796291] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Instance network_info: |[{"id": "60673b65-811c-44b5-a021-c476b84db981", "address": "fa:16:3e:07:c6:60", "network": {"id": "912e98c5-112e-4372-956a-a0c21ce4a2a4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-2092124176-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae96503b01a9442f96d122810ca18d88", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "713e54d5-283f-493d-b003-f13182deaf7b", "external-id": "cl2-zone-703", "segmentation_id": 703, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap60673b65-81", "ovs_interfaceid": "60673b65-811c-44b5-a021-c476b84db981", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 755.796694] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None 
req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:07:c6:60', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '713e54d5-283f-493d-b003-f13182deaf7b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '60673b65-811c-44b5-a021-c476b84db981', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 755.804274] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Creating folder: Project (ae96503b01a9442f96d122810ca18d88). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 755.804828] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-df4ad2fe-d6da-43d3-9e30-3440e11e632e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.816757] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Created folder: Project (ae96503b01a9442f96d122810ca18d88) in parent group-v845475. [ 755.816757] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Creating folder: Instances. Parent ref: group-v845523. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 755.816757] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-24c3f05d-5e7a-443d-ae18-8fd192047b39 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.825972] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Created folder: Instances in parent group-v845523. [ 755.825972] env[60175]: DEBUG oslo.service.loopingcall [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 755.826172] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 755.826534] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3ba84c6b-0edb-400f-9306-1be246c7bfed {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.846061] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 755.846061] env[60175]: value = "task-4292910" [ 755.846061] env[60175]: _type = "Task" [ 755.846061] env[60175]: } to complete. 
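The network_info blob logged for port 60673b65-811c-44b5-a021-c476b84db981 a few records above is a list of VIF dictionaries, each carrying the port id, MAC address and a nested network/subnets/ips structure. A small helper that flattens it; the sample data below is trimmed from the entry in this log:

def summarize_vifs(network_info):
    """Return (port id, MAC, fixed IPs) for each VIF in a network_info list."""
    summary = []
    for vif in network_info:
        ips = [ip['address']
               for subnet in vif['network']['subnets']
               for ip in subnet['ips']]
        summary.append((vif['id'], vif['address'], ips))
    return summary

network_info = [{
    'id': '60673b65-811c-44b5-a021-c476b84db981',
    'address': 'fa:16:3e:07:c6:60',
    'network': {'subnets': [{'ips': [{'address': '192.168.128.5'}]}]},
}]
print(summarize_vifs(network_info))
# -> [('60673b65-811c-44b5-a021-c476b84db981', 'fa:16:3e:07:c6:60', ['192.168.128.5'])]
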
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 755.853309] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292910, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 756.146964] env[60175]: DEBUG oslo_concurrency.lockutils [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.355350] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292910, 'name': CreateVM_Task, 'duration_secs': 0.289331} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 756.355530] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 756.356210] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 756.356370] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 756.356687] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 756.356921] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cfc02218-5554-4a4c-b397-06394499bee4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.361725] env[60175]: DEBUG oslo_vmware.api [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Waiting for the task: (returnval){ [ 756.361725] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e6912d-feb8-f613-62b3-c32835d0ee71" [ 756.361725] env[60175]: _type = "Task" [ 756.361725] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 756.369789] env[60175]: DEBUG oslo_vmware.api [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e6912d-feb8-f613-62b3-c32835d0ee71, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 756.874267] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 756.874543] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 756.874742] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 757.023139] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "72caf1e5-e894-4581-a95d-21dda85e11b0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 757.589446] env[60175]: DEBUG nova.compute.manager [req-40175d0e-8b99-4c66-9517-09d699e67abd req-be36a039-9a06-44a0-a929-34a5ff5ff402 service nova] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Received event network-changed-60673b65-811c-44b5-a021-c476b84db981 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 757.589543] env[60175]: DEBUG nova.compute.manager [req-40175d0e-8b99-4c66-9517-09d699e67abd req-be36a039-9a06-44a0-a929-34a5ff5ff402 service nova] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Refreshing instance network info cache due to event network-changed-60673b65-811c-44b5-a021-c476b84db981. 
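The Acquiring / acquired (waited ...s) / released (held ...s) lock records that run through this log, including the image-cache and refresh_cache locks just above, come from oslo.concurrency's lockutils helpers. A minimal sketch of the two usual forms, decorator and context manager; the lock names mirror ones in the log, the bodies are placeholders, and by default these are in-process semaphores (external=True switches to file-based locks):

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def claim(instance, claims):
    # The decorator is what emits the Acquiring/acquired/released records
    # with the waited/held timings seen throughout this log.
    claims.append(instance)

def cache_image(image_id, do_fetch):
    # Ad-hoc named lock, e.g. serializing work on one cached image.
    with lockutils.lock('[datastore2] devstack-image-cache_base/' + image_id):
        do_fetch()
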
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 757.589701] env[60175]: DEBUG oslo_concurrency.lockutils [req-40175d0e-8b99-4c66-9517-09d699e67abd req-be36a039-9a06-44a0-a929-34a5ff5ff402 service nova] Acquiring lock "refresh_cache-500d78f9-ee0c-4620-9936-1a9b4f4fc09a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 757.589844] env[60175]: DEBUG oslo_concurrency.lockutils [req-40175d0e-8b99-4c66-9517-09d699e67abd req-be36a039-9a06-44a0-a929-34a5ff5ff402 service nova] Acquired lock "refresh_cache-500d78f9-ee0c-4620-9936-1a9b4f4fc09a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 757.590024] env[60175]: DEBUG nova.network.neutron [req-40175d0e-8b99-4c66-9517-09d699e67abd req-be36a039-9a06-44a0-a929-34a5ff5ff402 service nova] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Refreshing network info cache for port 60673b65-811c-44b5-a021-c476b84db981 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 758.268054] env[60175]: DEBUG nova.network.neutron [req-40175d0e-8b99-4c66-9517-09d699e67abd req-be36a039-9a06-44a0-a929-34a5ff5ff402 service nova] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Updated VIF entry in instance network info cache for port 60673b65-811c-44b5-a021-c476b84db981. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 758.268054] env[60175]: DEBUG nova.network.neutron [req-40175d0e-8b99-4c66-9517-09d699e67abd req-be36a039-9a06-44a0-a929-34a5ff5ff402 service nova] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Updating instance_info_cache with network_info: [{"id": "60673b65-811c-44b5-a021-c476b84db981", "address": "fa:16:3e:07:c6:60", "network": {"id": "912e98c5-112e-4372-956a-a0c21ce4a2a4", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-2092124176-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae96503b01a9442f96d122810ca18d88", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "713e54d5-283f-493d-b003-f13182deaf7b", "external-id": "cl2-zone-703", "segmentation_id": 703, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap60673b65-81", "ovs_interfaceid": "60673b65-811c-44b5-a021-c476b84db981", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.281529] env[60175]: DEBUG oslo_concurrency.lockutils [req-40175d0e-8b99-4c66-9517-09d699e67abd req-be36a039-9a06-44a0-a929-34a5ff5ff402 service nova] Releasing lock "refresh_cache-500d78f9-ee0c-4620-9936-1a9b4f4fc09a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 759.140337] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.365126] env[60175]: DEBUG oslo_concurrency.lockutils [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "da3eaeea-ce26-40eb-af8b-8857f927e431" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 762.630293] env[60175]: DEBUG oslo_concurrency.lockutils [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "8bc7299c-35d4-4e9f-a243-2834fbadd987" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 769.773913] env[60175]: DEBUG oslo_concurrency.lockutils [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "843d4db6-c1fb-4b74-ad3c-779e309a170e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 769.916604] env[60175]: DEBUG oslo_concurrency.lockutils [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "81af879b-3bc3-4aff-a99d-98d3aba73512" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 799.167175] env[60175]: WARNING oslo_vmware.rw_handles [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 799.167175] env[60175]: ERROR oslo_vmware.rw_handles [ 799.167880] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 
tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/f4854cbb-8866-47a9-b71e-9ea4605a4b77/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 799.169573] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 799.169871] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Copying Virtual Disk [datastore2] vmware_temp/f4854cbb-8866-47a9-b71e-9ea4605a4b77/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/f4854cbb-8866-47a9-b71e-9ea4605a4b77/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 799.170205] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d6e565ff-cdb3-4455-9c58-e5488a9f41f4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.178670] env[60175]: DEBUG oslo_vmware.api [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Waiting for the task: (returnval){ [ 799.178670] env[60175]: value = "task-4292911" [ 799.178670] env[60175]: _type = "Task" [ 799.178670] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 799.186515] env[60175]: DEBUG oslo_vmware.api [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Task: {'id': task-4292911, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 799.690049] env[60175]: DEBUG oslo_vmware.exceptions [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Fault InvalidArgument not matched. 
{{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 799.690279] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 799.690867] env[60175]: ERROR nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 799.690867] env[60175]: Faults: ['InvalidArgument'] [ 799.690867] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Traceback (most recent call last): [ 799.690867] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 799.690867] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] yield resources [ 799.690867] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 799.690867] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] self.driver.spawn(context, instance, image_meta, [ 799.690867] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 799.690867] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 799.690867] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 799.690867] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] self._fetch_image_if_missing(context, vi) [ 799.690867] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] image_cache(vi, tmp_image_ds_loc) [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] vm_util.copy_virtual_disk( [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] session._wait_for_task(vmdk_copy_task) [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] return self.wait_for_task(task_ref) [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] return evt.wait() [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] result = hub.switch() [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 799.691278] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] return self.greenlet.switch() [ 799.691733] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 799.691733] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] self.f(*self.args, **self.kw) [ 799.691733] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 799.691733] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] raise exceptions.translate_fault(task_info.error) [ 799.691733] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 799.691733] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Faults: ['InvalidArgument'] [ 799.691733] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] [ 799.691733] env[60175]: INFO nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Terminating instance [ 799.693321] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 799.693321] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 799.693321] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4e2702ed-c17c-4844-9223-77ac42d6cd2f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.695528] env[60175]: DEBUG 
nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 799.695718] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 799.696434] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e38600ec-5b79-45c5-bc9d-9b9140e3bf67 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.703673] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 799.703880] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7541f531-885a-4dbe-aa9a-a483501697dc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.706069] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 799.706244] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 799.707194] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d2a4134e-e8cb-42ef-80b3-5561b4a02566 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.711750] env[60175]: DEBUG oslo_vmware.api [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Waiting for the task: (returnval){ [ 799.711750] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]526054a1-ae76-0462-b495-24555feae42f" [ 799.711750] env[60175]: _type = "Task" [ 799.711750] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 799.718598] env[60175]: DEBUG oslo_vmware.api [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]526054a1-ae76-0462-b495-24555feae42f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 799.780081] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 799.780081] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 799.780266] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Deleting the datastore file [datastore2] 72caf1e5-e894-4581-a95d-21dda85e11b0 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 799.780500] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c320ba12-b921-4097-ad46-c741ee668569 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.786681] env[60175]: DEBUG oslo_vmware.api [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Waiting for the task: (returnval){ [ 799.786681] env[60175]: value = "task-4292913" [ 799.786681] env[60175]: _type = "Task" [ 799.786681] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 799.794487] env[60175]: DEBUG oslo_vmware.api [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Task: {'id': task-4292913, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 800.222838] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 800.223165] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Creating directory with path [datastore2] vmware_temp/4433bf84-dce8-45f5-a847-37b1c1392f09/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 800.223409] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b3b40358-f158-4d03-b306-cdf1ebd60099 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.235256] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Created directory with path [datastore2] vmware_temp/4433bf84-dce8-45f5-a847-37b1c1392f09/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 800.235420] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Fetch image to [datastore2] vmware_temp/4433bf84-dce8-45f5-a847-37b1c1392f09/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 800.235573] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/4433bf84-dce8-45f5-a847-37b1c1392f09/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 800.236338] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abeb8a23-4e24-4e7d-b11e-4fe40472d540 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.242864] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01bee5bc-229c-4b0a-b75d-f9e5c69ade3a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.251963] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13aa6e48-f410-441d-b025-506ee7d5759d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.283242] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-df6700ab-4082-49b9-9f99-6137a158cae3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.292257] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1be8e28a-a957-485d-8e65-acafdc052822 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.296757] env[60175]: DEBUG oslo_vmware.api [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Task: {'id': task-4292913, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077537} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 800.297353] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 800.297560] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 800.297751] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 800.297931] env[60175]: INFO nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 800.300186] env[60175]: DEBUG nova.compute.claims [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 800.300359] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 800.300567] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 800.316877] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 800.366562] env[60175]: DEBUG oslo_vmware.rw_handles [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4433bf84-dce8-45f5-a847-37b1c1392f09/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 800.424481] env[60175]: DEBUG oslo_vmware.rw_handles [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 800.424680] env[60175]: DEBUG oslo_vmware.rw_handles [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4433bf84-dce8-45f5-a847-37b1c1392f09/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 800.669724] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bfb8d26-2ca5-49b7-bd66-d1193c58cb9a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.678422] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e981d3b4-5364-44e8-ae64-dde595e028d7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.709263] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3196620-f52d-4536-8563-7796463639c9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.716984] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a408afa9-8368-49b9-bd9f-2039fd5fde2e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.730724] env[60175]: DEBUG nova.compute.provider_tree [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 800.739433] env[60175]: DEBUG nova.scheduler.client.report [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 800.754647] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.454s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 800.755279] env[60175]: ERROR nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 800.755279] env[60175]: Faults: ['InvalidArgument'] [ 800.755279] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Traceback (most recent call last): [ 800.755279] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 800.755279] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] 
self.driver.spawn(context, instance, image_meta, [ 800.755279] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 800.755279] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 800.755279] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 800.755279] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] self._fetch_image_if_missing(context, vi) [ 800.755279] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 800.755279] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] image_cache(vi, tmp_image_ds_loc) [ 800.755279] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] vm_util.copy_virtual_disk( [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] session._wait_for_task(vmdk_copy_task) [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] return self.wait_for_task(task_ref) [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] return evt.wait() [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] result = hub.switch() [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] return self.greenlet.switch() [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 800.755669] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] self.f(*self.args, **self.kw) [ 800.756255] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 800.756255] env[60175]: ERROR nova.compute.manager [instance: 
72caf1e5-e894-4581-a95d-21dda85e11b0] raise exceptions.translate_fault(task_info.error) [ 800.756255] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 800.756255] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Faults: ['InvalidArgument'] [ 800.756255] env[60175]: ERROR nova.compute.manager [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] [ 800.756255] env[60175]: DEBUG nova.compute.utils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 800.757727] env[60175]: DEBUG nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Build of instance 72caf1e5-e894-4581-a95d-21dda85e11b0 was re-scheduled: A specified parameter was not correct: fileType [ 800.757727] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 800.758143] env[60175]: DEBUG nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 800.758339] env[60175]: DEBUG nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 800.758564] env[60175]: DEBUG nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 800.758742] env[60175]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 801.226805] env[60175]: DEBUG nova.network.neutron [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 801.237348] env[60175]: INFO nova.compute.manager [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Took 0.48 seconds to deallocate network for instance. [ 801.332393] env[60175]: INFO nova.scheduler.client.report [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Deleted allocations for instance 72caf1e5-e894-4581-a95d-21dda85e11b0 [ 801.363692] env[60175]: DEBUG oslo_concurrency.lockutils [None req-68ff1a30-ac0f-43d9-9d78-0f218d8105a0 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "72caf1e5-e894-4581-a95d-21dda85e11b0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 245.837s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 801.364838] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "72caf1e5-e894-4581-a95d-21dda85e11b0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 44.342s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 801.365067] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Acquiring lock "72caf1e5-e894-4581-a95d-21dda85e11b0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 801.365274] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "72caf1e5-e894-4581-a95d-21dda85e11b0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60175) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 801.365436] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "72caf1e5-e894-4581-a95d-21dda85e11b0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 801.367409] env[60175]: INFO nova.compute.manager [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Terminating instance [ 801.369209] env[60175]: DEBUG nova.compute.manager [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 801.369511] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 801.370073] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cae19dcb-2d11-4649-a73f-111df1db7b03 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.376195] env[60175]: DEBUG nova.compute.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 801.382854] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62ba02f5-057f-4b1f-b013-ebb7d6ac4dba {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.413505] env[60175]: WARNING nova.virt.vmwareapi.vmops [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 72caf1e5-e894-4581-a95d-21dda85e11b0 could not be found. [ 801.413720] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 801.413907] env[60175]: INFO nova.compute.manager [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 801.414215] env[60175]: DEBUG oslo.service.loopingcall [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 801.414449] env[60175]: DEBUG nova.compute.manager [-] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 801.414546] env[60175]: DEBUG nova.network.neutron [-] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 801.434246] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 801.434608] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 801.436937] env[60175]: INFO nova.compute.claims [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 801.448367] env[60175]: DEBUG nova.network.neutron [-] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 801.471797] env[60175]: INFO nova.compute.manager [-] [instance: 72caf1e5-e894-4581-a95d-21dda85e11b0] Took 0.06 seconds to deallocate network for instance. 
[ 801.622860] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fbbe5c89-fc3e-41ba-bcde-618e0e0da238 tempest-ImagesNegativeTestJSON-1138884532 tempest-ImagesNegativeTestJSON-1138884532-project-member] Lock "72caf1e5-e894-4581-a95d-21dda85e11b0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.258s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 801.755011] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e1c2379-b8ea-4516-9099-825ef9fbc04d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.764452] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-746e1c72-57a0-4041-b929-1a68619b24e5 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.794691] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8961053-2058-496d-aa29-865ea8a2b77f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.802415] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57ed8255-dde8-4c0c-bbce-b69a92331182 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.815881] env[60175]: DEBUG nova.compute.provider_tree [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 801.824315] env[60175]: DEBUG nova.scheduler.client.report [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 801.838025] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.403s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 801.838394] env[60175]: DEBUG nova.compute.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Start building networks asynchronously for instance. 
{{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 801.870321] env[60175]: DEBUG nova.compute.utils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 801.871886] env[60175]: DEBUG nova.compute.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 801.872109] env[60175]: DEBUG nova.network.neutron [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 801.880472] env[60175]: DEBUG nova.compute.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 801.911263] env[60175]: INFO nova.virt.block_device [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Booting with volume 48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0 at /dev/sda [ 801.957421] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-eddb5da0-1a67-489a-9af7-ee433f9c8c76 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.965951] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78e2e2ce-be07-4cc6-af20-45aa8cf0bbfa {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.977734] env[60175]: DEBUG nova.policy [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc147e2d92aa41bc9c7757eaa9adb7a4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '30c26c3c4591499e82e430e68f2889ef', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 801.997120] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a53da52d-6bf0-417a-8391-8b46c82d44fe {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.005551] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ab78cf9-3a5d-4e62-8aed-2270c5fd52d0 {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.033589] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1082f99-d02a-4038-bfb8-bf6931939301 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.039606] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd591bc5-f931-48d4-853c-bcd23248aa7b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.052775] env[60175]: DEBUG nova.virt.block_device [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Updating existing volume attachment record: 2f835c19-72af-4392-bef3-596395255df5 {{(pid=60175) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 802.270263] env[60175]: DEBUG nova.compute.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Start spawning the instance on the hypervisor. {{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 802.270973] env[60175]: DEBUG nova.virt.hardware [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 802.271405] env[60175]: DEBUG nova.virt.hardware [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 802.271666] env[60175]: DEBUG nova.virt.hardware [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 802.271957] env[60175]: DEBUG nova.virt.hardware [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 802.272237] env[60175]: DEBUG nova.virt.hardware [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 802.272507] env[60175]: DEBUG nova.virt.hardware [None 
req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 802.272876] env[60175]: DEBUG nova.virt.hardware [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 802.273162] env[60175]: DEBUG nova.virt.hardware [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 802.273511] env[60175]: DEBUG nova.virt.hardware [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 802.273756] env[60175]: DEBUG nova.virt.hardware [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 802.273951] env[60175]: DEBUG nova.virt.hardware [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 802.275068] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6bf80fa-f9c6-4a4d-abca-5a4d85f6ce4c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.283968] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20f6e4d2-086d-45aa-9bbd-f21b425ec6c7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.950195] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.950380] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Cleaning up deleted instances {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 802.965353] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] There are 0 instances to clean {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 802.965583] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60175) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.965719] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Cleaning up deleted instances with incomplete migration {{(pid=60175) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 802.980762] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 803.081905] env[60175]: DEBUG nova.network.neutron [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Successfully created port: 29e209ef-b352-49cc-ac79-37612fe95e01 {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 804.591880] env[60175]: DEBUG nova.network.neutron [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Successfully updated port: 29e209ef-b352-49cc-ac79-37612fe95e01 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 804.602915] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "refresh_cache-39a2035c-bb7b-4837-b556-e8bb38ffb514" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 804.604689] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquired lock "refresh_cache-39a2035c-bb7b-4837-b556-e8bb38ffb514" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 804.604764] env[60175]: DEBUG nova.network.neutron [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 804.625404] env[60175]: DEBUG nova.compute.manager [req-9b113eee-aaff-4a76-9bec-4683cb9e6533 req-e250bc6d-6b07-4728-a80f-561c661562ce service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Received event network-vif-plugged-29e209ef-b352-49cc-ac79-37612fe95e01 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 804.625520] env[60175]: DEBUG oslo_concurrency.lockutils [req-9b113eee-aaff-4a76-9bec-4683cb9e6533 req-e250bc6d-6b07-4728-a80f-561c661562ce service nova] Acquiring lock "39a2035c-bb7b-4837-b556-e8bb38ffb514-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 804.625716] env[60175]: DEBUG oslo_concurrency.lockutils [req-9b113eee-aaff-4a76-9bec-4683cb9e6533 req-e250bc6d-6b07-4728-a80f-561c661562ce service nova] Lock "39a2035c-bb7b-4837-b556-e8bb38ffb514-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 804.625927] env[60175]: DEBUG oslo_concurrency.lockutils [req-9b113eee-aaff-4a76-9bec-4683cb9e6533 req-e250bc6d-6b07-4728-a80f-561c661562ce service nova] Lock "39a2035c-bb7b-4837-b556-e8bb38ffb514-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 804.626048] env[60175]: DEBUG nova.compute.manager [req-9b113eee-aaff-4a76-9bec-4683cb9e6533 req-e250bc6d-6b07-4728-a80f-561c661562ce service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] No waiting events found dispatching network-vif-plugged-29e209ef-b352-49cc-ac79-37612fe95e01 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 804.626316] env[60175]: WARNING nova.compute.manager [req-9b113eee-aaff-4a76-9bec-4683cb9e6533 req-e250bc6d-6b07-4728-a80f-561c661562ce service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Received unexpected event network-vif-plugged-29e209ef-b352-49cc-ac79-37612fe95e01 for instance with vm_state building and task_state spawning. [ 804.661527] env[60175]: DEBUG nova.network.neutron [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 804.953054] env[60175]: DEBUG nova.network.neutron [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Updating instance_info_cache with network_info: [{"id": "29e209ef-b352-49cc-ac79-37612fe95e01", "address": "fa:16:3e:82:02:7e", "network": {"id": "dc028a67-b3aa-4e09-aaf8-b6922a1b245a", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-20061545-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30c26c3c4591499e82e430e68f2889ef", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ccbc7a-cf8d-4ea2-8411-291a1e27df7b", "external-id": "nsx-vlan-transportzone-998", "segmentation_id": 998, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29e209ef-b3", "ovs_interfaceid": "29e209ef-b352-49cc-ac79-37612fe95e01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 804.963238] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Releasing lock "refresh_cache-39a2035c-bb7b-4837-b556-e8bb38ffb514" {{(pid=60175) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 804.963503] env[60175]: DEBUG nova.compute.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Instance network_info: |[{"id": "29e209ef-b352-49cc-ac79-37612fe95e01", "address": "fa:16:3e:82:02:7e", "network": {"id": "dc028a67-b3aa-4e09-aaf8-b6922a1b245a", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-20061545-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30c26c3c4591499e82e430e68f2889ef", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ccbc7a-cf8d-4ea2-8411-291a1e27df7b", "external-id": "nsx-vlan-transportzone-998", "segmentation_id": 998, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29e209ef-b3", "ovs_interfaceid": "29e209ef-b352-49cc-ac79-37612fe95e01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 804.963862] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:82:02:7e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '04ccbc7a-cf8d-4ea2-8411-291a1e27df7b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '29e209ef-b352-49cc-ac79-37612fe95e01', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 804.971853] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Creating folder: Project (30c26c3c4591499e82e430e68f2889ef). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 804.972410] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dede862a-117c-4ea1-8c09-087ea25ab184 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.985892] env[60175]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. 
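The Folder.CreateFolder invocation above comes back as a DuplicateName fault (the Fault list entry just below), which the driver treats as "folder already exists" and reuses the existing Project folder. The following is a minimal sketch of that idempotent pattern on top of oslo.vmware, assuming an already established oslo_vmware.api.VMwareAPISession and assuming the fault details expose the existing folder's identifier; the helper name ensure_folder is illustrative and is not the actual nova.virt.vmwareapi.vm_util code.

from oslo_vmware import exceptions as vexc


def ensure_folder(session, parent_folder_ref, name):
    # session: an established oslo_vmware.api.VMwareAPISession (assumption).
    # parent_folder_ref: managed object reference of the parent folder,
    # e.g. the group-v845475 ref seen in the log above.
    try:
        # invoke_api() issues the SOAP call with oslo.vmware's retry/relogin
        # handling; this mirrors the Folder.CreateFolder request in the log.
        return session.invoke_api(session.vim, 'CreateFolder',
                                  parent_folder_ref, name=name)
    except vexc.DuplicateName as err:
        # vCenter reports the name clash as a SOAP fault delivered with
        # HTTP 200 (hence the suds warning above). Assumption: the fault
        # details carry the existing folder's identifier under 'object',
        # so the call stays idempotent and the existing folder is reused.
        return err.details['object']

The same session drives the task handling seen further down: long-running calls such as CreateVM_Task, RelocateVM_Task and ReconfigVM_Task return a task reference, and session.wait_for_task() polls it (the wait_for_task/_poll_task entries at oslo_vmware/api.py) until the task reaches a terminal state.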
[ 804.986071] env[60175]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=60175) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 804.986399] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 804.986626] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Folder already exists: Project (30c26c3c4591499e82e430e68f2889ef). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 804.986862] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Creating folder: Instances. Parent ref: group-v845513. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 804.987102] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 804.987452] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eece543e-b1e6-4216-91d1-5bb4ea550102 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.996650] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Created folder: Instances in parent group-v845513. [ 804.996862] env[60175]: DEBUG oslo.service.loopingcall [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 804.997040] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 804.997227] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5db22ba0-3bed-4b6e-bc18-facec126de04 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.015936] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 805.015936] env[60175]: value = "task-4292916" [ 805.015936] env[60175]: _type = "Task" [ 805.015936] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 805.023421] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292916, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 805.525582] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292916, 'name': CreateVM_Task, 'duration_secs': 0.3268} completed successfully. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 805.525758] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 805.526423] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'guest_format': None, 'attachment_id': '2f835c19-72af-4392-bef3-596395255df5', 'device_type': None, 'disk_bus': None, 'mount_device': '/dev/sda', 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-845516', 'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'name': 'volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '39a2035c-bb7b-4837-b556-e8bb38ffb514', 'attached_at': '', 'detached_at': '', 'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'serial': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0'}, 'boot_index': 0, 'delete_on_termination': True, 'volume_type': None}], 'swap': None} {{(pid=60175) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 805.526646] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Root volume attach. Driver type: vmdk {{(pid=60175) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 805.527421] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a5f25f8-32c8-4b5e-9337-20868ada422a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.535237] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6577ecab-7a59-4358-9fd9-8eef545404d7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.541227] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51cbfdd7-5733-42bd-a94e-443b07d8d441 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.548476] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-bad1d411-5d01-4608-b9af-0dcfeda1f1cd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.556716] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for the task: (returnval){ [ 805.556716] env[60175]: value = "task-4292917" [ 805.556716] env[60175]: _type = "Task" [ 805.556716] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 805.563461] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292917, 'name': RelocateVM_Task} progress is 5%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 805.949953] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 805.961031] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 805.961215] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 805.961389] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 805.961694] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 805.962627] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c2b9e99-71c8-4c1b-9973-0484d6919da4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.970646] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faa407dd-d0a5-420f-a7a8-f4ac4a1c4405 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.988260] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a67a05f8-5de2-4f85-8975-e97920e6aee3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.992982] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10ad81ee-fe94-44ba-b3a1-1c8ebd36baad {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.031210] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180729MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 806.031366] env[60175]: DEBUG oslo_concurrency.lockutils [None 
req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 806.031575] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 806.068078] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292917, 'name': RelocateVM_Task, 'duration_secs': 0.352823} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 806.068078] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Volume attach. Driver type: vmdk {{(pid=60175) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 806.068078] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-845516', 'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'name': 'volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '39a2035c-bb7b-4837-b556-e8bb38ffb514', 'attached_at': '', 'detached_at': '', 'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'serial': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0'} {{(pid=60175) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 806.068862] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2532aeb2-b92e-4d87-8e61-765e7e671809 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.091731] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0d26444-2b42-41a9-896b-7611f0210a3d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.117366] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Reconfiguring VM instance instance-0000000f to attach disk [datastore2] volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0/volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0.vmdk or device None with type thin {{(pid=60175) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 806.118411] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-71a7c124-7e2c-40db-b59a-dc8945f6be9e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.143038] 
env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for the task: (returnval){ [ 806.143038] env[60175]: value = "task-4292918" [ 806.143038] env[60175]: _type = "Task" [ 806.143038] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 806.153654] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292918, 'name': ReconfigVM_Task} progress is 6%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 806.156218] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 806.156359] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 806.156481] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance da3eaeea-ce26-40eb-af8b-8857f927e431 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 806.156601] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 8bc7299c-35d4-4e9f-a243-2834fbadd987 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 806.156718] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 81af879b-3bc3-4aff-a99d-98d3aba73512 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 806.156834] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 843d4db6-c1fb-4b74-ad3c-779e309a170e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 806.156947] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 029d2099-2e55-4632-81b6-b59d6a20faab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 806.157069] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 068814dd-328c-48d1-b514-34eb43b0f2b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 806.157886] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 500d78f9-ee0c-4620-9936-1a9b4f4fc09a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 806.158081] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 39a2035c-bb7b-4837-b556-e8bb38ffb514 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 806.174160] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 57a5dcae-6861-418a-a041-9cd5b7a43982 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 806.185675] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 6c94c59c-44ab-4cb9-8480-18e8a424993b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 806.195899] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 806.206427] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 71244679-78d6-4d49-b4b5-ef96fd313ae8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 806.218306] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance fb825c5f-bd66-40aa-8027-cb425f3b9b96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 806.228989] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance c76409ad-b0aa-4da6-ac83-58f617ec2588 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 806.241097] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 63823a4b-97e0-48f9-9fb9-7c4fe3858343 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 806.251691] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 070d142d-6a47-49bc-a061-3101da79447a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 806.251967] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 806.252186] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=149GB used_disk=9GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 806.518097] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e03d2c76-b96d-4386-973d-526893141c9d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.526692] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18aa159c-afdb-4cfd-b5ce-873c89f4bec9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.558696] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66821a6d-53fa-4bf0-9791-918b68dfc0c6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.565130] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2d1ec22-2bf3-4eeb-9d0c-c8c165137514 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.583659] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 806.594698] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 806.608341] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 806.608533] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 806.652947] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed 
tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292918, 'name': ReconfigVM_Task, 'duration_secs': 0.243489} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 806.653271] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Reconfigured VM instance instance-0000000f to attach disk [datastore2] volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0/volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0.vmdk or device None with type thin {{(pid=60175) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 806.657937] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-cad2d069-f9ed-4630-b682-213393066bdd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.674730] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for the task: (returnval){ [ 806.674730] env[60175]: value = "task-4292919" [ 806.674730] env[60175]: _type = "Task" [ 806.674730] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 806.684055] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292919, 'name': ReconfigVM_Task} progress is 5%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 806.960907] env[60175]: DEBUG nova.compute.manager [req-d495acd6-70c8-4a6e-9116-70649c38dcfd req-718fd6b3-9b84-48cd-bbb1-4e68f0fd8563 service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Received event network-changed-29e209ef-b352-49cc-ac79-37612fe95e01 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 806.960907] env[60175]: DEBUG nova.compute.manager [req-d495acd6-70c8-4a6e-9116-70649c38dcfd req-718fd6b3-9b84-48cd-bbb1-4e68f0fd8563 service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Refreshing instance network info cache due to event network-changed-29e209ef-b352-49cc-ac79-37612fe95e01. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 806.960907] env[60175]: DEBUG oslo_concurrency.lockutils [req-d495acd6-70c8-4a6e-9116-70649c38dcfd req-718fd6b3-9b84-48cd-bbb1-4e68f0fd8563 service nova] Acquiring lock "refresh_cache-39a2035c-bb7b-4837-b556-e8bb38ffb514" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 806.960907] env[60175]: DEBUG oslo_concurrency.lockutils [req-d495acd6-70c8-4a6e-9116-70649c38dcfd req-718fd6b3-9b84-48cd-bbb1-4e68f0fd8563 service nova] Acquired lock "refresh_cache-39a2035c-bb7b-4837-b556-e8bb38ffb514" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 806.960907] env[60175]: DEBUG nova.network.neutron [req-d495acd6-70c8-4a6e-9116-70649c38dcfd req-718fd6b3-9b84-48cd-bbb1-4e68f0fd8563 service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Refreshing network info cache for port 29e209ef-b352-49cc-ac79-37612fe95e01 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 807.184364] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292919, 'name': ReconfigVM_Task, 'duration_secs': 0.118537} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 807.184679] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-845516', 'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'name': 'volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '39a2035c-bb7b-4837-b556-e8bb38ffb514', 'attached_at': '', 'detached_at': '', 'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'serial': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0'} {{(pid=60175) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 807.185300] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-08e07f27-8357-4d2e-b462-043e2286c3da {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.191614] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for the task: (returnval){ [ 807.191614] env[60175]: value = "task-4292920" [ 807.191614] env[60175]: _type = "Task" [ 807.191614] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 807.199472] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292920, 'name': Rename_Task} progress is 5%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 807.444954] env[60175]: DEBUG nova.network.neutron [req-d495acd6-70c8-4a6e-9116-70649c38dcfd req-718fd6b3-9b84-48cd-bbb1-4e68f0fd8563 service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Updated VIF entry in instance network info cache for port 29e209ef-b352-49cc-ac79-37612fe95e01. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 807.444954] env[60175]: DEBUG nova.network.neutron [req-d495acd6-70c8-4a6e-9116-70649c38dcfd req-718fd6b3-9b84-48cd-bbb1-4e68f0fd8563 service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Updating instance_info_cache with network_info: [{"id": "29e209ef-b352-49cc-ac79-37612fe95e01", "address": "fa:16:3e:82:02:7e", "network": {"id": "dc028a67-b3aa-4e09-aaf8-b6922a1b245a", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-20061545-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30c26c3c4591499e82e430e68f2889ef", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ccbc7a-cf8d-4ea2-8411-291a1e27df7b", "external-id": "nsx-vlan-transportzone-998", "segmentation_id": 998, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29e209ef-b3", "ovs_interfaceid": "29e209ef-b352-49cc-ac79-37612fe95e01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 807.461235] env[60175]: DEBUG oslo_concurrency.lockutils [req-d495acd6-70c8-4a6e-9116-70649c38dcfd req-718fd6b3-9b84-48cd-bbb1-4e68f0fd8563 service nova] Releasing lock "refresh_cache-39a2035c-bb7b-4837-b556-e8bb38ffb514" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 807.609556] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 807.610882] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 807.610882] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 807.637756] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Skipping network cache update for instance because it is Building. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 807.637933] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 807.638367] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 807.638609] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 807.638811] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 807.638995] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 807.639189] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 807.639366] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 807.639539] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 807.640977] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 807.640977] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 807.640977] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 807.702721] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292920, 'name': Rename_Task, 'duration_secs': 0.11319} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 807.703088] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Powering on the VM {{(pid=60175) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 807.703389] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-615beec6-2f58-472e-a695-278c637b45cd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.710162] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for the task: (returnval){ [ 807.710162] env[60175]: value = "task-4292921" [ 807.710162] env[60175]: _type = "Task" [ 807.710162] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 807.718499] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292921, 'name': PowerOnVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 807.949546] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 807.949822] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 807.949971] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 807.950130] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 808.220902] env[60175]: DEBUG oslo_vmware.api [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292921, 'name': PowerOnVM_Task, 'duration_secs': 0.43067} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 808.221223] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Powered on the VM {{(pid=60175) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 808.221425] env[60175]: INFO nova.compute.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Took 5.95 seconds to spawn the instance on the hypervisor. [ 808.221699] env[60175]: DEBUG nova.compute.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Checking state {{(pid=60175) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 808.222495] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27c6b883-fa45-445f-82d9-c4e762f2f3f3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.277550] env[60175]: INFO nova.compute.manager [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Took 6.86 seconds to build instance. [ 808.289796] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2bced2d4-7a7b-4d50-baed-bb68588d03ed tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "39a2035c-bb7b-4837-b556-e8bb38ffb514" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 163.980s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 808.299632] env[60175]: DEBUG nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 808.425261] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.425508] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 808.429603] env[60175]: INFO nova.compute.claims [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 808.509598] env[60175]: DEBUG nova.scheduler.client.report [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Refreshing inventories for resource provider 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 808.527374] env[60175]: DEBUG nova.scheduler.client.report [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Updating ProviderTree inventory for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 808.528034] env[60175]: DEBUG nova.compute.provider_tree [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Updating inventory in ProviderTree for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 808.539643] env[60175]: DEBUG nova.scheduler.client.report [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Refreshing aggregate associations for resource provider 3984c8da-53ad-4889-8d1f-23bab60fa84e, aggregates: None {{(pid=60175) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 808.560272] env[60175]: DEBUG nova.scheduler.client.report [None 
req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Refreshing trait associations for resource provider 3984c8da-53ad-4889-8d1f-23bab60fa84e, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=60175) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 808.844487] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "67cfe7ba-4590-451b-9e1a-340977b597a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 808.844714] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "67cfe7ba-4590-451b-9e1a-340977b597a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 808.849186] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e866df58-ef2a-4a6c-a903-d424c6df1aed {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.856883] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aa2ec42-1640-4095-9232-00cf0ee76639 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.890408] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac92fd1f-10e8-4352-b1b4-d59d979909d2 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.897837] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1513afee-c8a8-4da7-9c6a-aaf4276a8403 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.911908] env[60175]: DEBUG nova.compute.provider_tree [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 808.925035] env[60175]: DEBUG nova.scheduler.client.report [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 808.941300] 
env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.516s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 808.941772] env[60175]: DEBUG nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 808.950179] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 808.979631] env[60175]: DEBUG nova.compute.utils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 808.979631] env[60175]: DEBUG nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Not allocating networking since 'none' was specified. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 808.990305] env[60175]: DEBUG nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 809.057934] env[60175]: DEBUG nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Start spawning the instance on the hypervisor. 
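Note on the inventory reported above for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e: Placement derives the claimable capacity of each resource class as (total - reserved) * allocation_ratio, with max_unit capping any single allocation. Working that out for the logged figures (values copied from the inventory data above):

inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'max_unit': 16,    'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 268,    'reserved': 0,   'max_unit': 146,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: capacity={capacity:.0f}, largest single allocation={inv['max_unit']}")
# VCPU: capacity=192, MEMORY_MB: capacity=196078, DISK_GB: capacity=268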
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 809.080631] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 809.080888] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 809.081089] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 809.081230] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 809.081370] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 809.081510] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 809.081727] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 809.081896] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 809.082070] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f 
tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 809.082229] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 809.082397] env[60175]: DEBUG nova.virt.hardware [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 809.083257] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24e6ee2c-e204-4d64-a040-1ae2be674757 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.090808] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-808ac938-be07-459b-b5e8-829be0d3d81d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.104993] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Instance VIF info [] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 809.110502] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Creating folder: Project (2c0ae2f740df40b6a1987a5eb2d51803). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 809.110827] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-02174bf7-4d63-42da-ac65-9f751f54532d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.120753] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Created folder: Project (2c0ae2f740df40b6a1987a5eb2d51803) in parent group-v845475. [ 809.120984] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Creating folder: Instances. Parent ref: group-v845528. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 809.121284] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f220e248-8133-4f6c-b18a-c8ad46e88013 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.130288] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Created folder: Instances in parent group-v845528. 
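Note on the nova.virt.hardware records above: with no flavor or image topology limits, the maxima default to 65536 sockets/cores/threads, every (sockets, cores, threads) factorisation of the vCPU count is enumerated, and for this 1-vCPU flavor the only candidate is 1:1:1. The function below is a simplified re-implementation of that enumeration for illustration only; Nova's real code additionally orders candidates by preference.

from collections import namedtuple

VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    # Yield every sockets*cores*threads factorisation of vcpus within the limits (sketch).
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        for cores in range(1, min(vcpus // sockets, max_cores) + 1):
            if (vcpus // sockets) % cores:
                continue
            threads = vcpus // sockets // cores
            if threads <= max_threads:
                yield VirtCPUTopology(sockets, cores, threads)

print(list(possible_topologies(1)))   # [VirtCPUTopology(sockets=1, cores=1, threads=1)]
print(list(possible_topologies(4)))   # 1x1x4, 1x2x2, 1x4x1, 2x1x2, 2x2x1, 4x1x1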
[ 809.130506] env[60175]: DEBUG oslo.service.loopingcall [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 809.130732] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 809.130891] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-23fe06ec-b2b2-4251-b226-7a65aa3b755c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.148965] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 809.148965] env[60175]: value = "task-4292924" [ 809.148965] env[60175]: _type = "Task" [ 809.148965] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 809.156313] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292924, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 809.660542] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292924, 'name': CreateVM_Task, 'duration_secs': 0.24334} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 809.660845] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 809.661399] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 809.661626] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 809.662025] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 809.662458] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ff88020e-65d4-46c3-a873-7a712e2d040e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.667483] env[60175]: DEBUG oslo_vmware.api [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Waiting for the task: (returnval){ [ 809.667483] env[60175]: value = 
"session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52aa8eac-5aa2-6fa9-058a-04446421f86a" [ 809.667483] env[60175]: _type = "Task" [ 809.667483] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 809.675617] env[60175]: DEBUG oslo_vmware.api [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52aa8eac-5aa2-6fa9-058a-04446421f86a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 810.178641] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 810.179105] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 810.179483] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 810.378820] env[60175]: DEBUG nova.compute.manager [req-f271023b-cc1f-42c1-9dc0-da7d718155a6 req-3e509554-fb29-429f-b3d2-2e3a944d39d5 service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Received event network-changed-29e209ef-b352-49cc-ac79-37612fe95e01 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 810.378820] env[60175]: DEBUG nova.compute.manager [req-f271023b-cc1f-42c1-9dc0-da7d718155a6 req-3e509554-fb29-429f-b3d2-2e3a944d39d5 service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Refreshing instance network info cache due to event network-changed-29e209ef-b352-49cc-ac79-37612fe95e01. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 810.378907] env[60175]: DEBUG oslo_concurrency.lockutils [req-f271023b-cc1f-42c1-9dc0-da7d718155a6 req-3e509554-fb29-429f-b3d2-2e3a944d39d5 service nova] Acquiring lock "refresh_cache-39a2035c-bb7b-4837-b556-e8bb38ffb514" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 810.379139] env[60175]: DEBUG oslo_concurrency.lockutils [req-f271023b-cc1f-42c1-9dc0-da7d718155a6 req-3e509554-fb29-429f-b3d2-2e3a944d39d5 service nova] Acquired lock "refresh_cache-39a2035c-bb7b-4837-b556-e8bb38ffb514" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 810.379358] env[60175]: DEBUG nova.network.neutron [req-f271023b-cc1f-42c1-9dc0-da7d718155a6 req-3e509554-fb29-429f-b3d2-2e3a944d39d5 service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Refreshing network info cache for port 29e209ef-b352-49cc-ac79-37612fe95e01 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 810.877922] env[60175]: DEBUG nova.network.neutron [req-f271023b-cc1f-42c1-9dc0-da7d718155a6 req-3e509554-fb29-429f-b3d2-2e3a944d39d5 service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Updated VIF entry in instance network info cache for port 29e209ef-b352-49cc-ac79-37612fe95e01. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 810.878412] env[60175]: DEBUG nova.network.neutron [req-f271023b-cc1f-42c1-9dc0-da7d718155a6 req-3e509554-fb29-429f-b3d2-2e3a944d39d5 service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Updating instance_info_cache with network_info: [{"id": "29e209ef-b352-49cc-ac79-37612fe95e01", "address": "fa:16:3e:82:02:7e", "network": {"id": "dc028a67-b3aa-4e09-aaf8-b6922a1b245a", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-20061545-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30c26c3c4591499e82e430e68f2889ef", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ccbc7a-cf8d-4ea2-8411-291a1e27df7b", "external-id": "nsx-vlan-transportzone-998", "segmentation_id": 998, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29e209ef-b3", "ovs_interfaceid": "29e209ef-b352-49cc-ac79-37612fe95e01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 810.896788] env[60175]: DEBUG oslo_concurrency.lockutils [req-f271023b-cc1f-42c1-9dc0-da7d718155a6 req-3e509554-fb29-429f-b3d2-2e3a944d39d5 service nova] Releasing lock "refresh_cache-39a2035c-bb7b-4837-b556-e8bb38ffb514" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 826.067470] env[60175]: INFO nova.compute.manager [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 
tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Rebuilding instance [ 826.100074] env[60175]: DEBUG nova.objects.instance [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lazy-loading 'trusted_certs' on Instance uuid 39a2035c-bb7b-4837-b556-e8bb38ffb514 {{(pid=60175) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 826.113018] env[60175]: DEBUG nova.compute.manager [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Checking state {{(pid=60175) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 826.113018] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cf60ddd-b16e-495b-be0d-675546b5cd48 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.153374] env[60175]: DEBUG nova.objects.instance [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lazy-loading 'pci_requests' on Instance uuid 39a2035c-bb7b-4837-b556-e8bb38ffb514 {{(pid=60175) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 826.162316] env[60175]: DEBUG nova.objects.instance [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lazy-loading 'pci_devices' on Instance uuid 39a2035c-bb7b-4837-b556-e8bb38ffb514 {{(pid=60175) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 826.171762] env[60175]: DEBUG nova.objects.instance [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lazy-loading 'resources' on Instance uuid 39a2035c-bb7b-4837-b556-e8bb38ffb514 {{(pid=60175) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 826.178515] env[60175]: DEBUG nova.objects.instance [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lazy-loading 'migration_context' on Instance uuid 39a2035c-bb7b-4837-b556-e8bb38ffb514 {{(pid=60175) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 826.185731] env[60175]: DEBUG nova.objects.instance [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Trying to apply a migration context that does not seem to be set for this instance {{(pid=60175) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1032}} [ 826.189026] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Powering off the VM {{(pid=60175) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 826.189026] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-847cbcec-dc1a-4819-b906-0c7586709b6d {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.194188] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for the task: (returnval){ [ 826.194188] env[60175]: value = "task-4292925" [ 826.194188] env[60175]: _type = "Task" [ 826.194188] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 826.202764] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292925, 'name': PowerOffVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 826.705635] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292925, 'name': PowerOffVM_Task, 'duration_secs': 0.171913} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 826.705909] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Powered off the VM {{(pid=60175) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 826.706658] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Powering off the VM {{(pid=60175) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 826.706912] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-83429810-7ddc-428a-9343-a394cfa3d702 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.714313] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for the task: (returnval){ [ 826.714313] env[60175]: value = "task-4292926" [ 826.714313] env[60175]: _type = "Task" [ 826.714313] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 826.721232] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292926, 'name': PowerOffVM_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 827.224212] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] VM already powered off {{(pid=60175) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 827.224554] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Volume detach. Driver type: vmdk {{(pid=60175) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 827.224600] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-845516', 'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'name': 'volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '39a2035c-bb7b-4837-b556-e8bb38ffb514', 'attached_at': '', 'detached_at': '', 'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'serial': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0'} {{(pid=60175) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 827.225333] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5371e422-8e66-42b1-a065-dd1912ed9bc1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.242987] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0639798d-612c-439f-b6ac-aac9917848bc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.248887] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fcc75a9-23b1-41a6-b045-ce654dfa85d8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.267197] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd2e6a4d-8050-45d0-8fa1-af4348a4a9ef {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.281287] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] The volume has not been displaced from its original location: [datastore2] volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0/volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0.vmdk. No consolidation needed. 
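Note on the "VM already powered off" record above: the second PowerOffVM_Task does not fail the rebuild because the power-off helper treats an invalid-power-state style error as success, so callers can invoke it unconditionally. The wrapper below sketches that idempotent behaviour; InvalidPowerStateError and power_off_vm are hypothetical stand-ins for the real vSphere bindings.

class InvalidPowerStateError(Exception):
    """Raised by the (hypothetical) backend when the VM is not powered on."""

def power_off_instance(power_off_vm, vm_ref):
    # Power off a VM, treating "already off" as success (sketch).
    try:
        power_off_vm(vm_ref)
        print('Powered off the VM')
    except InvalidPowerStateError:
        print('VM already powered off')

# usage: the first call powers off, the second is a no-op
state = {'powered_on': True}
def fake_power_off(_ref):
    if not state['powered_on']:
        raise InvalidPowerStateError()
    state['powered_on'] = False

power_off_instance(fake_power_off, 'vm-845516')
power_off_instance(fake_power_off, 'vm-845516')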
{{(pid=60175) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 827.286415] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Reconfiguring VM instance instance-0000000f to detach disk 2000 {{(pid=60175) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 827.286698] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-4e4b517b-13a0-4727-825a-9dd56a290076 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.303654] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for the task: (returnval){ [ 827.303654] env[60175]: value = "task-4292927" [ 827.303654] env[60175]: _type = "Task" [ 827.303654] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 827.310902] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292927, 'name': ReconfigVM_Task} progress is 5%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 827.813082] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292927, 'name': ReconfigVM_Task, 'duration_secs': 0.41447} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 827.813382] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Reconfigured VM instance instance-0000000f to detach disk 2000 {{(pid=60175) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 827.817912] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-e40d37fc-e24d-4bde-8f48-4f5e301a3baa {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.833495] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for the task: (returnval){ [ 827.833495] env[60175]: value = "task-4292928" [ 827.833495] env[60175]: _type = "Task" [ 827.833495] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 827.842036] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292928, 'name': ReconfigVM_Task} progress is 6%. 
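Note on the "has not been displaced from its original location" message a few records above: the expected location '[datastore2] volume-<id>/volume-<id>.vmdk' is derived from the 'name' field of the connection data logged earlier, and a consolidation copy is only needed when the disk has moved away from it. A trivial sketch of that path derivation (original_vmdk_path is a hypothetical helper and the datastore name is hard-coded from the log, since the connection data only carries the vm-845516 moref):

def original_vmdk_path(connection_data, datastore='datastore2'):
    # Build '[datastore] volume-<id>/volume-<id>.vmdk' for a Cinder-backed vmdk volume (sketch).
    name = connection_data['name']                     # e.g. 'volume-48d0eff8-...'
    return f'[{datastore}] {name}/{name}.vmdk'

data = {'volume': 'vm-845516',
        'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0',
        'name': 'volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0'}
print(original_vmdk_path(data))
# [datastore2] volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0/volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0.vmdk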
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 828.343623] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292928, 'name': ReconfigVM_Task, 'duration_secs': 0.282004} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 828.343946] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-845516', 'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'name': 'volume-48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '39a2035c-bb7b-4837-b556-e8bb38ffb514', 'attached_at': '', 'detached_at': '', 'volume_id': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0', 'serial': '48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0'} {{(pid=60175) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 828.344214] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 828.344946] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0992c4a1-bd96-41c2-bdce-67bd6d0ad5c8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.351182] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 828.351380] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f05887b7-f477-4fb1-82b2-69cccab8e56d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.429202] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 828.429495] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 828.429736] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Deleting the datastore file [datastore2] 
39a2035c-bb7b-4837-b556-e8bb38ffb514 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 828.430050] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4e0faa29-c68e-49f6-aad8-2ac700d09a52 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.436307] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Waiting for the task: (returnval){ [ 828.436307] env[60175]: value = "task-4292930" [ 828.436307] env[60175]: _type = "Task" [ 828.436307] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 828.444260] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292930, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 828.946667] env[60175]: DEBUG oslo_vmware.api [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Task: {'id': task-4292930, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.294327} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 828.946667] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 828.946667] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 828.946837] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 828.995546] env[60175]: DEBUG nova.virt.vmwareapi.volumeops [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Volume detach. 
Driver type: vmdk {{(pid=60175) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 828.995863] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0eb645f4-39ec-4bcf-8a14-5020c6805e89 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 829.004285] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3386a1e-2771-4d85-a97c-5021cb8876bd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 829.034346] env[60175]: ERROR nova.compute.manager [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Failed to detach volume 48d0eff8-2973-4ae3-a6b0-4a0f5971f8e0 from /dev/sda: nova.exception.InstanceNotFound: Instance 39a2035c-bb7b-4837-b556-e8bb38ffb514 could not be found. [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Traceback (most recent call last): [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self.driver.rebuild(**kwargs) [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] raise NotImplementedError() [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] NotImplementedError [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] During handling of the above exception, another exception occurred: [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Traceback (most recent call last): [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 829.034346] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self.driver.detach_volume(context, old_connection_info, [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] return self._volumeops.detach_volume(connection_info, instance) [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self._detach_volume_vmdk(connection_info, instance) [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] 
File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] stable_ref.fetch_moref(session) [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] raise exception.InstanceNotFound(instance_id=self._uuid) [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] nova.exception.InstanceNotFound: Instance 39a2035c-bb7b-4837-b556-e8bb38ffb514 could not be found. [ 829.034783] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] [ 829.153079] env[60175]: DEBUG nova.compute.utils [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Build of instance 39a2035c-bb7b-4837-b556-e8bb38ffb514 aborted: Failed to rebuild volume backed instance. {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 829.155673] env[60175]: ERROR nova.compute.manager [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance 39a2035c-bb7b-4837-b556-e8bb38ffb514 aborted: Failed to rebuild volume backed instance. 
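Note on the traceback above and its continuation below: driver.rebuild() raises NotImplementedError, the manager falls back to the default rebuild path, that path tries to detach the root volume of a VM that has already been unregistered, and the resulting InstanceNotFound is finally wrapped in a BuildAbortException — hence the repeated "During handling of the above exception, another exception occurred" sections. A toy reproduction of how that chained traceback shape arises (the exception classes here only mirror Nova's names):

import traceback

class InstanceNotFound(Exception):
    pass

class BuildAbortException(Exception):
    pass

def driver_rebuild():
    # The driver has no native rebuild(), so the base implementation raises.
    raise NotImplementedError()

def detach_root_volume():
    # The VM was already unregistered and its files deleted, so the lookup fails.
    raise InstanceNotFound('Instance 39a2035c-bb7b-4837-b556-e8bb38ffb514 could not be found.')

def rebuild_instance():
    try:
        driver_rebuild()
    except NotImplementedError:
        # Fallback path for volume-backed instances: detach the root volume first.
        try:
            detach_root_volume()
        except InstanceNotFound:
            raise BuildAbortException('Failed to rebuild volume backed instance.')

try:
    rebuild_instance()
except BuildAbortException:
    traceback.print_exc()   # prints the same chained "During handling of the above exception" shape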
[ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Traceback (most recent call last): [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self.driver.rebuild(**kwargs) [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] raise NotImplementedError() [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] NotImplementedError [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] During handling of the above exception, another exception occurred: [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Traceback (most recent call last): [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 3570, in _rebuild_volume_backed_instance [ 829.155673] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self._detach_root_volume(context, instance, root_bdm) [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 3549, in _detach_root_volume [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] with excutils.save_and_reraise_exception(): [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self.force_reraise() [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] raise self.value [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self.driver.detach_volume(context, old_connection_info, [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] return self._volumeops.detach_volume(connection_info, instance) [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", 
line 649, in detach_volume [ 829.156160] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self._detach_volume_vmdk(connection_info, instance) [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] stable_ref.fetch_moref(session) [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] raise exception.InstanceNotFound(instance_id=self._uuid) [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] nova.exception.InstanceNotFound: Instance 39a2035c-bb7b-4837-b556-e8bb38ffb514 could not be found. [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] During handling of the above exception, another exception occurred: [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Traceback (most recent call last): [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 10738, in _error_out_instance_on_exception [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] yield [ 829.156597] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 3826, in rebuild_instance [ 829.157018] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self._do_rebuild_instance_with_claim( [ 829.157018] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 3912, in _do_rebuild_instance_with_claim [ 829.157018] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self._do_rebuild_instance( [ 829.157018] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 4104, in _do_rebuild_instance [ 829.157018] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self._rebuild_default_impl(**kwargs) [ 829.157018] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 3693, in _rebuild_default_impl [ 829.157018] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] self._rebuild_volume_backed_instance( [ 829.157018] env[60175]: ERROR nova.compute.manager 
[instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] File "/opt/stack/nova/nova/compute/manager.py", line 3585, in _rebuild_volume_backed_instance [ 829.157018] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] raise exception.BuildAbortException( [ 829.157018] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] nova.exception.BuildAbortException: Build of instance 39a2035c-bb7b-4837-b556-e8bb38ffb514 aborted: Failed to rebuild volume backed instance. [ 829.157018] env[60175]: ERROR nova.compute.manager [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] [ 829.258203] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 829.258203] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 829.457767] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e839d615-8fbf-499c-86c4-abc83f1c0d6b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 829.465378] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-460b62c3-c27d-478d-ba70-09cc962acc6a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 829.493844] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ce8ae9d-23f6-4f92-8e01-704fa73dc79d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 829.500428] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ecf4c2c-a894-42e3-ace3-726acdeac81d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 829.512722] env[60175]: DEBUG nova.compute.provider_tree [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 829.521297] env[60175]: DEBUG nova.scheduler.client.report [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 829.537907] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.279s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 829.537907] env[60175]: INFO nova.compute.manager [None req-2b0b5a59-c022-40ef-a4d0-4735537fb4a3 tempest-ServerActionsV293TestJSON-4180666 tempest-ServerActionsV293TestJSON-4180666-project-member] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Successfully reverted task state from rebuilding on failure for instance. [ 831.517855] env[60175]: DEBUG nova.compute.manager [req-28ef72d3-c20b-4007-8c91-6c48410f202f req-0978742f-e38d-41b0-9611-fb7e7d03a70b service nova] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Received event network-vif-deleted-29e209ef-b352-49cc-ac79-37612fe95e01 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 834.278474] env[60175]: DEBUG nova.compute.manager [req-40773d82-ea9a-4d9b-a2ad-8b2c479bab69 req-4dbee8d7-7308-4ab9-b673-74eb605f6192 service nova] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Received event network-vif-deleted-5e559ae9-2ba8-4907-b16c-34d4c09c7d10 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 837.676932] env[60175]: DEBUG nova.compute.manager [req-f097cf2b-8981-4fbc-9016-a228b8a7cbfe req-76f356c4-b55d-4159-ae09-128becef0c5d service nova] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Received event network-vif-deleted-9fc574b7-1869-4a09-8bd5-72d1381e6c73 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 839.835184] env[60175]: DEBUG nova.compute.manager [req-2c9c78e5-f2e1-4858-9d59-c051311607a6 req-558c545f-aee8-4a7e-83d7-0dcbf91d6d90 service nova] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Received event network-vif-deleted-60673b65-811c-44b5-a021-c476b84db981 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 846.154239] env[60175]: WARNING oslo_vmware.rw_handles [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed 
connection without" [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 846.154239] env[60175]: ERROR oslo_vmware.rw_handles [ 846.155363] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/4433bf84-dce8-45f5-a847-37b1c1392f09/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 846.156630] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 846.156884] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Copying Virtual Disk [datastore2] vmware_temp/4433bf84-dce8-45f5-a847-37b1c1392f09/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/4433bf84-dce8-45f5-a847-37b1c1392f09/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 846.157193] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-35bdc7c1-fbdf-4378-9e7f-995ea4e9d3e2 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.165740] env[60175]: DEBUG oslo_vmware.api [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Waiting for the task: (returnval){ [ 846.165740] env[60175]: value = "task-4292932" [ 846.165740] env[60175]: _type = "Task" [ 846.165740] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 846.173962] env[60175]: DEBUG oslo_vmware.api [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Task: {'id': task-4292932, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 846.676491] env[60175]: DEBUG oslo_vmware.exceptions [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Fault InvalidArgument not matched. 
{{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 846.676757] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 846.677337] env[60175]: ERROR nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 846.677337] env[60175]: Faults: ['InvalidArgument'] [ 846.677337] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Traceback (most recent call last): [ 846.677337] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 846.677337] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] yield resources [ 846.677337] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 846.677337] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] self.driver.spawn(context, instance, image_meta, [ 846.677337] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 846.677337] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 846.677337] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 846.677337] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] self._fetch_image_if_missing(context, vi) [ 846.677337] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] image_cache(vi, tmp_image_ds_loc) [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] vm_util.copy_virtual_disk( [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] session._wait_for_task(vmdk_copy_task) [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] return self.wait_for_task(task_ref) [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] return evt.wait() [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] result = hub.switch() [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 846.677928] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] return self.greenlet.switch() [ 846.678567] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 846.678567] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] self.f(*self.args, **self.kw) [ 846.678567] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 846.678567] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] raise exceptions.translate_fault(task_info.error) [ 846.678567] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 846.678567] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Faults: ['InvalidArgument'] [ 846.678567] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] [ 846.678567] env[60175]: INFO nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Terminating instance [ 846.679411] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 846.679609] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 846.680399] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Start destroying the 
instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 846.680554] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 846.680788] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-193564f9-8b69-4ca6-91cc-66baf08282eb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.683519] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1259e154-4282-4c30-8929-f7472a9b05be {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.690618] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 846.690848] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9393ccd5-2d62-4a83-a096-5d780f81f00c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.693648] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 846.693820] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 846.694961] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0c5a84d6-5856-46bf-9365-21db1fa5badc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.703497] env[60175]: DEBUG oslo_vmware.api [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Waiting for the task: (returnval){ [ 846.703497] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52230344-97ec-af9a-f538-359dd8d75808" [ 846.703497] env[60175]: _type = "Task" [ 846.703497] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 846.711421] env[60175]: DEBUG oslo_vmware.api [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52230344-97ec-af9a-f538-359dd8d75808, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 846.760800] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 846.763021] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 846.763021] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Deleting the datastore file [datastore2] 3107f9c0-9a35-424c-9fa3-d60057b9ceec {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 846.763021] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-187ce5da-bbe7-44a3-920b-c391e36696ec {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.770312] env[60175]: DEBUG oslo_vmware.api [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Waiting for the task: (returnval){ [ 846.770312] env[60175]: value = "task-4292934" [ 846.770312] env[60175]: _type = "Task" [ 846.770312] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 846.779957] env[60175]: DEBUG oslo_vmware.api [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Task: {'id': task-4292934, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 847.214475] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 847.214767] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Creating directory with path [datastore2] vmware_temp/0a886e88-5a47-4a0a-a413-f15cafe5168b/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 847.215033] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a33e5da2-22be-4519-8f4d-f1623dfd5db6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.226600] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Created directory with path [datastore2] vmware_temp/0a886e88-5a47-4a0a-a413-f15cafe5168b/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 847.226804] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Fetch image to [datastore2] vmware_temp/0a886e88-5a47-4a0a-a413-f15cafe5168b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 847.226966] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/0a886e88-5a47-4a0a-a413-f15cafe5168b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 847.227875] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-724280be-311c-4fc5-a0bc-123b8ff1294d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.234828] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-265bd17b-f471-4c67-bafd-a2fc33a45877 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.243676] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c7b0292-dfe3-4bbd-b52a-22cf8162c469 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.278022] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-330d677f-8149-4477-8d0d-20ce886c5f1f {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.285918] env[60175]: DEBUG oslo_vmware.api [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Task: {'id': task-4292934, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072729} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 847.292022] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 847.292022] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 847.292022] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 847.292022] env[60175]: INFO nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Took 0.61 seconds to destroy the instance on the hypervisor. 
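The "Waiting for the task … progress is 0% … completed successfully" records above come from oslo.vmware's task polling (wait_for_task/_poll_task in oslo_vmware/api.py). The snippet below is only a simplified, self-contained sketch of that poll-until-terminal-state pattern, not the library's code; TaskInfo, fetch_task_info() and the half-second interval are assumptions made for the example.

```python
# Minimal sketch of the poll-until-done loop behind the "Waiting for the
# task ... progress is N% ... completed successfully" records above.
# Hypothetical stand-ins: TaskInfo, fetch_task_info(), POLL_INTERVAL.
import time
from dataclasses import dataclass

POLL_INTERVAL = 0.5  # assumed polling period, in seconds


@dataclass
class TaskInfo:
    task_id: str
    name: str
    state: str          # 'queued' | 'running' | 'success' | 'error'
    progress: int = 0
    error: str | None = None


class TaskFailed(Exception):
    """Raised when the remote task ends in the 'error' state."""


def wait_for_task(task_id: str, fetch_task_info) -> TaskInfo:
    """Poll a remote task until it reaches a terminal state.

    fetch_task_info(task_id) is a hypothetical callable standing in for a
    property-collector round trip against vCenter; it returns a TaskInfo.
    """
    while True:
        info = fetch_task_info(task_id)
        if info.state == 'success':
            return info                       # e.g. DeleteDatastoreFile_Task done
        if info.state == 'error':
            raise TaskFailed(f"{info.name} ({task_id}) failed: {info.error}")
        # Not terminal yet: report progress and poll again.
        print(f"Task {task_id} ({info.name}) progress is {info.progress}%.")
        time.sleep(POLL_INTERVAL)
```

In the log, the same loop surfaces both outcomes: task-4292934 (DeleteDatastoreFile_Task) completes successfully, while the earlier CopyVirtualDisk_Task ends in the InvalidArgument fault that is re-raised as a VimFaultException.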
[ 847.292022] env[60175]: DEBUG nova.compute.claims [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 847.292661] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 847.292661] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 847.293705] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-95a681d7-da78-4010-9608-596a21661f2c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.386157] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 847.453231] env[60175]: DEBUG oslo_vmware.rw_handles [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0a886e88-5a47-4a0a-a413-f15cafe5168b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 847.508535] env[60175]: DEBUG oslo_vmware.rw_handles [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 847.508726] env[60175]: DEBUG oslo_vmware.rw_handles [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0a886e88-5a47-4a0a-a413-f15cafe5168b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 847.564446] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62db5467-1fbf-4fec-b78d-8416706dfcb6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.572408] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e674197-8afd-4158-b84e-0bb2ab953958 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.603026] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26fc58f9-65e8-4117-8f0e-70985d2b6af7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.610604] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9da47556-e16b-42e3-93db-791bd99aee24 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.623972] env[60175]: DEBUG nova.compute.provider_tree [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 847.633629] env[60175]: DEBUG nova.scheduler.client.report [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 847.646365] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.355s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 847.646890] env[60175]: ERROR nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 847.646890] env[60175]: Faults: ['InvalidArgument'] [ 847.646890] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Traceback (most recent call last): [ 847.646890] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 847.646890] env[60175]: ERROR nova.compute.manager 
[instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] self.driver.spawn(context, instance, image_meta, [ 847.646890] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 847.646890] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 847.646890] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 847.646890] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] self._fetch_image_if_missing(context, vi) [ 847.646890] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 847.646890] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] image_cache(vi, tmp_image_ds_loc) [ 847.646890] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] vm_util.copy_virtual_disk( [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] session._wait_for_task(vmdk_copy_task) [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] return self.wait_for_task(task_ref) [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] return evt.wait() [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] result = hub.switch() [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] return self.greenlet.switch() [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 847.647362] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] self.f(*self.args, **self.kw) [ 847.647865] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 847.647865] 
env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] raise exceptions.translate_fault(task_info.error) [ 847.647865] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 847.647865] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Faults: ['InvalidArgument'] [ 847.647865] env[60175]: ERROR nova.compute.manager [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] [ 847.647865] env[60175]: DEBUG nova.compute.utils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 847.649085] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Build of instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec was re-scheduled: A specified parameter was not correct: fileType [ 847.649085] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 847.649462] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 847.649656] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 847.649822] env[60175]: DEBUG nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 847.649938] env[60175]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 848.353566] env[60175]: DEBUG nova.network.neutron [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 848.370384] env[60175]: INFO nova.compute.manager [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Took 0.72 seconds to deallocate network for instance. [ 848.483499] env[60175]: INFO nova.scheduler.client.report [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Deleted allocations for instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec [ 848.504271] env[60175]: DEBUG oslo_concurrency.lockutils [None req-db70b45a-6a6e-4c98-9d0f-1e52ed4a9078 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 294.096s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.505413] env[60175]: DEBUG oslo_concurrency.lockutils [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 92.359s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 848.507802] env[60175]: DEBUG oslo_concurrency.lockutils [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Acquiring lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 848.507802] env[60175]: DEBUG oslo_concurrency.lockutils [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 848.507802] env[60175]: DEBUG oslo_concurrency.lockutils [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.508931] env[60175]: INFO nova.compute.manager [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Terminating instance [ 848.511291] env[60175]: DEBUG nova.compute.manager [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 848.511491] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 848.511822] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-03529fd9-c8f6-4a80-9c54-5d197e67952d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.520507] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a918ada7-ce6b-458a-b74d-bf9d95caa514 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.534700] env[60175]: DEBUG nova.compute.manager [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] [instance: 6c94c59c-44ab-4cb9-8480-18e8a424993b] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.555619] env[60175]: WARNING nova.virt.vmwareapi.vmops [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3107f9c0-9a35-424c-9fa3-d60057b9ceec could not be found. 
[ 848.555832] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 848.556009] env[60175]: INFO nova.compute.manager [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Took 0.04 seconds to destroy the instance on the hypervisor. [ 848.556282] env[60175]: DEBUG oslo.service.loopingcall [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 848.556522] env[60175]: DEBUG nova.compute.manager [-] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 848.556624] env[60175]: DEBUG nova.network.neutron [-] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 848.561328] env[60175]: DEBUG nova.compute.manager [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] [instance: 6c94c59c-44ab-4cb9-8480-18e8a424993b] Instance disappeared before build. {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.581072] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f7aeea98-40ba-4b13-96a0-735947cb1400 tempest-ServerMetadataNegativeTestJSON-404749069 tempest-ServerMetadataNegativeTestJSON-404749069-project-member] Lock "6c94c59c-44ab-4cb9-8480-18e8a424993b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.036s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.592113] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.601097] env[60175]: DEBUG nova.network.neutron [-] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 848.612332] env[60175]: INFO nova.compute.manager [-] [instance: 3107f9c0-9a35-424c-9fa3-d60057b9ceec] Took 0.05 seconds to deallocate network for instance. 
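The "Acquiring lock … acquired … waited 0.000s … released … held N s" triplets that bracket every resource-tracker operation in this log come from oslo_concurrency.lockutils. The sketch below is a rough, process-local approximation of that instrumentation built on threading.Lock; the real lockutils also provides inter-process file locks and decorators, which this example does not attempt to reproduce.

```python
# Rough, process-local approximation of the lockutils log pattern seen above:
# 'Acquiring lock "X" by "Y"' -> 'Lock "X" acquired by "Y" :: waited Ns'
# -> 'Lock "X" "released" by "Y" :: held Ms'.  Not oslo_concurrency itself.
import contextlib
import logging
import threading
import time

LOG = logging.getLogger(__name__)
_LOCKS: dict[str, threading.Lock] = {}


@contextlib.contextmanager
def timed_lock(name: str, caller: str):
    lock = _LOCKS.setdefault(name, threading.Lock())
    LOG.debug('Acquiring lock "%s" by "%s"', name, caller)
    start = time.monotonic()
    lock.acquire()
    acquired = time.monotonic()
    LOG.debug('Lock "%s" acquired by "%s" :: waited %.3fs',
              name, caller, acquired - start)
    try:
        yield
    finally:
        lock.release()
        LOG.debug('Lock "%s" "released" by "%s" :: held %.3fs',
                  name, caller, time.monotonic() - acquired)


# Usage mirroring the resource-tracker records in this section:
# with timed_lock("compute_resources",
#                 "nova.compute.resource_tracker.ResourceTracker.instance_claim"):
#     ...  # claim or free resources for the instance
```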
[ 848.649772] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 848.650149] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 848.651838] env[60175]: INFO nova.compute.claims [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 848.725593] env[60175]: DEBUG oslo_concurrency.lockutils [None req-34563205-bcf8-48af-b615-d61e11925448 tempest-ServerDiagnosticsNegativeTest-225692876 tempest-ServerDiagnosticsNegativeTest-225692876-project-member] Lock "3107f9c0-9a35-424c-9fa3-d60057b9ceec" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.219s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.868143] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4623c5d-c129-4726-8044-4349aea52e7f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.875990] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b887c4b-b347-4f2a-b10f-cad517e2e491 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.905231] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d416fc61-2b48-40b8-999f-fc5dfc8c855d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.913480] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-130f68b7-4951-4413-8a22-b581fed18ccd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.926334] env[60175]: DEBUG nova.compute.provider_tree [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 848.938021] env[60175]: DEBUG nova.scheduler.client.report [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 848.952726] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.953188] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 848.985448] env[60175]: DEBUG nova.compute.utils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 848.988268] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 848.988268] env[60175]: DEBUG nova.network.neutron [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 848.996612] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 849.064029] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Start spawning the instance on the hypervisor. 
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 849.084914] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 849.086380] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 849.086380] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 849.086380] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 849.086380] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 849.086380] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 849.086625] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 849.086625] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 849.086625] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e 
tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 849.086828] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 849.087271] env[60175]: DEBUG nova.virt.hardware [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 849.088352] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39cd20c2-605d-407d-9f87-d416674c4833 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.096537] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb4f73f1-ba02-4a71-be62-0f23ff6c3910 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.343281] env[60175]: DEBUG nova.policy [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '324274af4ca54998ab7056451e9a0ace', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '72d50358bf2c41d5a556afb101074e8e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 850.008917] env[60175]: DEBUG nova.network.neutron [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Successfully created port: eceff385-d2a9-47ed-8510-3db604f00a8f {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 850.582349] env[60175]: DEBUG nova.compute.manager [req-47c58ba2-51d8-4e2f-a513-57901935ff80 req-3d6eec62-e378-4741-9509-e9950cdae2d8 service nova] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Received event network-vif-plugged-eceff385-d2a9-47ed-8510-3db604f00a8f {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 850.582566] env[60175]: DEBUG oslo_concurrency.lockutils [req-47c58ba2-51d8-4e2f-a513-57901935ff80 req-3d6eec62-e378-4741-9509-e9950cdae2d8 service nova] Acquiring lock "53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 850.582772] env[60175]: DEBUG oslo_concurrency.lockutils [req-47c58ba2-51d8-4e2f-a513-57901935ff80 req-3d6eec62-e378-4741-9509-e9950cdae2d8 service nova] Lock "53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 850.582932] env[60175]: DEBUG oslo_concurrency.lockutils [req-47c58ba2-51d8-4e2f-a513-57901935ff80 req-3d6eec62-e378-4741-9509-e9950cdae2d8 service nova] Lock "53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 850.583322] env[60175]: DEBUG nova.compute.manager [req-47c58ba2-51d8-4e2f-a513-57901935ff80 req-3d6eec62-e378-4741-9509-e9950cdae2d8 service nova] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] No waiting events found dispatching network-vif-plugged-eceff385-d2a9-47ed-8510-3db604f00a8f {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 850.583541] env[60175]: WARNING nova.compute.manager [req-47c58ba2-51d8-4e2f-a513-57901935ff80 req-3d6eec62-e378-4741-9509-e9950cdae2d8 service nova] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Received unexpected event network-vif-plugged-eceff385-d2a9-47ed-8510-3db604f00a8f for instance with vm_state building and task_state spawning. [ 850.665048] env[60175]: DEBUG nova.network.neutron [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Successfully updated port: eceff385-d2a9-47ed-8510-3db604f00a8f {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 850.674500] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "refresh_cache-53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 850.674645] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquired lock "refresh_cache-53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 850.674792] env[60175]: DEBUG nova.network.neutron [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 850.723018] env[60175]: DEBUG nova.network.neutron [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 850.996554] env[60175]: DEBUG nova.network.neutron [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Updating instance_info_cache with network_info: [{"id": "eceff385-d2a9-47ed-8510-3db604f00a8f", "address": "fa:16:3e:68:0f:45", "network": {"id": "bf739c39-d2c2-4778-b7e0-4125fcfb9027", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1397118134-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "72d50358bf2c41d5a556afb101074e8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2b6a4065-12af-4fb9-ac47-ec9143f7297e", "external-id": "nsx-vlan-transportzone-95", "segmentation_id": 95, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeceff385-d2", "ovs_interfaceid": "eceff385-d2a9-47ed-8510-3db604f00a8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 851.007777] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Releasing lock "refresh_cache-53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 851.008539] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Instance network_info: |[{"id": "eceff385-d2a9-47ed-8510-3db604f00a8f", "address": "fa:16:3e:68:0f:45", "network": {"id": "bf739c39-d2c2-4778-b7e0-4125fcfb9027", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1397118134-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "72d50358bf2c41d5a556afb101074e8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2b6a4065-12af-4fb9-ac47-ec9143f7297e", "external-id": "nsx-vlan-transportzone-95", "segmentation_id": 95, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeceff385-d2", "ovs_interfaceid": "eceff385-d2a9-47ed-8510-3db604f00a8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 851.008670] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None 
req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:68:0f:45', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2b6a4065-12af-4fb9-ac47-ec9143f7297e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'eceff385-d2a9-47ed-8510-3db604f00a8f', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 851.016347] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Creating folder: Project (72d50358bf2c41d5a556afb101074e8e). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 851.016895] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-482b3397-86fc-493e-b2a3-7ed02d6fe729 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.027403] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Created folder: Project (72d50358bf2c41d5a556afb101074e8e) in parent group-v845475. [ 851.027589] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Creating folder: Instances. Parent ref: group-v845531. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 851.027801] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dbbfbb46-e5e5-4846-b281-6c74dbda1d3c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.036554] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Created folder: Instances in parent group-v845531. [ 851.036767] env[60175]: DEBUG oslo.service.loopingcall [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 851.036938] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 851.037136] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b55e88db-699b-4f8c-85f0-da829bf3a32c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.055777] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 851.055777] env[60175]: value = "task-4292937" [ 851.055777] env[60175]: _type = "Task" [ 851.055777] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 851.062778] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292937, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 851.565420] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292937, 'name': CreateVM_Task, 'duration_secs': 0.273571} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 851.565547] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 851.566227] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 851.566369] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 851.566682] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 851.566917] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d5786f6b-fe4d-4ec7-97b2-ab12086ef6ab {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.571240] env[60175]: DEBUG oslo_vmware.api [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Waiting for the task: (returnval){ [ 851.571240] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5297ace5-5a42-99dc-57db-c8b48a133925" [ 851.571240] env[60175]: _type = "Task" [ 851.571240] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 851.578441] env[60175]: DEBUG oslo_vmware.api [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5297ace5-5a42-99dc-57db-c8b48a133925, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 852.083814] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 852.085105] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 852.085105] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 852.609152] env[60175]: DEBUG nova.compute.manager [req-cfbcbca0-1182-4eb1-bd34-989ad9e1c9f0 req-05d1ac39-29c7-4ed9-a01f-f0c8e1b54a73 service nova] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Received event network-changed-eceff385-d2a9-47ed-8510-3db604f00a8f {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 852.609336] env[60175]: DEBUG nova.compute.manager [req-cfbcbca0-1182-4eb1-bd34-989ad9e1c9f0 req-05d1ac39-29c7-4ed9-a01f-f0c8e1b54a73 service nova] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Refreshing instance network info cache due to event network-changed-eceff385-d2a9-47ed-8510-3db604f00a8f. {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 852.609628] env[60175]: DEBUG oslo_concurrency.lockutils [req-cfbcbca0-1182-4eb1-bd34-989ad9e1c9f0 req-05d1ac39-29c7-4ed9-a01f-f0c8e1b54a73 service nova] Acquiring lock "refresh_cache-53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 852.609771] env[60175]: DEBUG oslo_concurrency.lockutils [req-cfbcbca0-1182-4eb1-bd34-989ad9e1c9f0 req-05d1ac39-29c7-4ed9-a01f-f0c8e1b54a73 service nova] Acquired lock "refresh_cache-53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 852.609931] env[60175]: DEBUG nova.network.neutron [req-cfbcbca0-1182-4eb1-bd34-989ad9e1c9f0 req-05d1ac39-29c7-4ed9-a01f-f0c8e1b54a73 service nova] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Refreshing network info cache for port eceff385-d2a9-47ed-8510-3db604f00a8f {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 853.290381] env[60175]: DEBUG nova.network.neutron [req-cfbcbca0-1182-4eb1-bd34-989ad9e1c9f0 req-05d1ac39-29c7-4ed9-a01f-f0c8e1b54a73 service nova] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Updated VIF entry in instance network info cache for port eceff385-d2a9-47ed-8510-3db604f00a8f. 
{{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 853.290381] env[60175]: DEBUG nova.network.neutron [req-cfbcbca0-1182-4eb1-bd34-989ad9e1c9f0 req-05d1ac39-29c7-4ed9-a01f-f0c8e1b54a73 service nova] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Updating instance_info_cache with network_info: [{"id": "eceff385-d2a9-47ed-8510-3db604f00a8f", "address": "fa:16:3e:68:0f:45", "network": {"id": "bf739c39-d2c2-4778-b7e0-4125fcfb9027", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1397118134-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "72d50358bf2c41d5a556afb101074e8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2b6a4065-12af-4fb9-ac47-ec9143f7297e", "external-id": "nsx-vlan-transportzone-95", "segmentation_id": 95, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeceff385-d2", "ovs_interfaceid": "eceff385-d2a9-47ed-8510-3db604f00a8f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 853.300436] env[60175]: DEBUG oslo_concurrency.lockutils [req-cfbcbca0-1182-4eb1-bd34-989ad9e1c9f0 req-05d1ac39-29c7-4ed9-a01f-f0c8e1b54a73 service nova] Releasing lock "refresh_cache-53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 854.639878] env[60175]: DEBUG nova.compute.manager [req-d16f2b79-d9ed-4a2c-a0ab-476cd79f093f req-4674c540-be87-4ba1-afb4-31a3beb89c72 service nova] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Received event network-vif-deleted-eceff385-d2a9-47ed-8510-3db604f00a8f {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 865.949588] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 865.959514] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 865.959728] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 865.959885] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
865.960043] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 865.961097] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fa60161-33bb-4c83-a0fa-646e8eabdd81 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.969680] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06a1d668-8f18-440d-9055-338acfc550ef {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.983344] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-762f33eb-6f54-4f82-8dd5-39d57f26599f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.989373] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f35c564-0b90-4039-9a4b-d9561a05b6fd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.018179] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180718MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 866.018341] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 866.018542] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 866.082233] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 866.082233] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance da3eaeea-ce26-40eb-af8b-8857f927e431 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 866.082233] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 8bc7299c-35d4-4e9f-a243-2834fbadd987 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 866.082233] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 81af879b-3bc3-4aff-a99d-98d3aba73512 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 866.082458] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 843d4db6-c1fb-4b74-ad3c-779e309a170e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 866.095119] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 070d142d-6a47-49bc-a061-3101da79447a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 866.107335] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 67cfe7ba-4590-451b-9e1a-340977b597a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 866.107557] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 866.107700] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=149GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 866.204675] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f50b698-95f5-49a5-aff8-777f7f2a700d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.212696] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62dc7941-8dfb-48f9-a1ff-42640652aac0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.248648] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49c8ee00-114c-4fb8-96b7-9d0772109e78 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.256301] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c2a48ad-3916-4ef7-b5c9-ce0a2ed9558a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.269720] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 866.277722] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 866.290021] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 866.290329] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.272s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 867.285572] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running 
periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 867.285868] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 867.285973] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 867.949920] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 868.950241] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 868.950602] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 868.950602] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 868.964795] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 868.964939] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 868.965131] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 868.965299] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 868.965456] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 868.965609] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 869.950056] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 869.966034] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 869.966354] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 869.966354] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 870.951212] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 873.246594] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.246856] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.415711] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "a45c150e-942b-454a-ab59-aa6b191bfada" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.416069] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "a45c150e-942b-454a-ab59-aa6b191bfada" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 896.626337] env[60175]: WARNING oslo_vmware.rw_handles [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Error occurred while reading the HTTP 
response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 896.626337] env[60175]: ERROR oslo_vmware.rw_handles [ 896.627044] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/0a886e88-5a47-4a0a-a413-f15cafe5168b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 896.628442] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 896.628681] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Copying Virtual Disk [datastore2] vmware_temp/0a886e88-5a47-4a0a-a413-f15cafe5168b/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/0a886e88-5a47-4a0a-a413-f15cafe5168b/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 896.628965] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d3b5f2ec-8ec0-4222-83ca-4a081b06850a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.637143] env[60175]: DEBUG oslo_vmware.api [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Waiting for the task: (returnval){ [ 896.637143] env[60175]: value = "task-4292938" [ 896.637143] env[60175]: _type = "Task" [ 896.637143] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 896.645671] env[60175]: DEBUG oslo_vmware.api [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Task: {'id': task-4292938, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 897.147467] env[60175]: DEBUG oslo_vmware.exceptions [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Fault InvalidArgument not matched. {{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 897.147728] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 897.148280] env[60175]: ERROR nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 897.148280] env[60175]: Faults: ['InvalidArgument'] [ 897.148280] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Traceback (most recent call last): [ 897.148280] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 897.148280] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] yield resources [ 897.148280] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 897.148280] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] self.driver.spawn(context, instance, image_meta, [ 897.148280] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 897.148280] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] self._vmops.spawn(context, instance, image_meta, injected_files, [ 897.148280] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 897.148280] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] self._fetch_image_if_missing(context, vi) [ 897.148280] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] image_cache(vi, tmp_image_ds_loc) [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in 
_cache_sparse_image [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] vm_util.copy_virtual_disk( [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] session._wait_for_task(vmdk_copy_task) [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] return self.wait_for_task(task_ref) [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] return evt.wait() [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] result = hub.switch() [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 897.148800] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] return self.greenlet.switch() [ 897.149194] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 897.149194] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] self.f(*self.args, **self.kw) [ 897.149194] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 897.149194] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] raise exceptions.translate_fault(task_info.error) [ 897.149194] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 897.149194] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Faults: ['InvalidArgument'] [ 897.149194] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] [ 897.149194] env[60175]: INFO nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Terminating instance [ 897.150138] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} 
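
The `InvalidArgument: fileType` failure traced above is surfaced by oslo.vmware's task poller: `wait_for_task` polls the vCenter task and, once the task reports an error, translates it into a `VimFaultException`. A minimal sketch of that pattern follows; the vCenter host, credentials and datastore paths are placeholders and are not taken from this deployment.

# Minimal sketch, not Nova's code: poll a vCenter task with oslo.vmware and
# inspect the fault list, as in the CopyVirtualDisk_Task failure above.
# Host, credentials and datastore paths are placeholders.
from oslo_vmware import api
from oslo_vmware import exceptions as vexc

session = api.VMwareAPISession(
    'vc.example.org', 'svc-user', 'secret',
    api_retry_count=10, task_poll_interval=0.5)

disk_mgr = session.vim.service_content.virtualDiskManager
try:
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore2] vmware_temp/example/tmp-sparse.vmdk',
        destName='[datastore2] vmware_temp/example/example.vmdk')
    session.wait_for_task(task)   # raises once the task reports an error
except vexc.VimFaultException as exc:
    # In the failure above, exc.fault_list contains 'InvalidArgument'.
    print(exc.fault_list, exc)
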
[ 897.150337] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 897.150564] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6c3dc45a-be43-4ca6-98f1-190102cf2d2c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.152991] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 897.153209] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 897.153956] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1eb3bd23-dd71-4498-8941-e04d958d40ea {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.160482] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 897.160680] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c0024b14-3279-4020-a59b-b67da9cbde78 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.162810] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 897.162975] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 897.163879] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0235753b-a76a-4946-9967-3f2359bb92a3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.169569] env[60175]: DEBUG oslo_vmware.api [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Waiting for the task: (returnval){ [ 897.169569] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e67409-78e5-5b9e-d7c4-d264dd91562a" [ 897.169569] env[60175]: _type = "Task" [ 897.169569] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 897.176412] env[60175]: DEBUG oslo_vmware.api [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e67409-78e5-5b9e-d7c4-d264dd91562a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 897.231572] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 897.231820] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 897.231968] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Deleting the datastore file [datastore2] 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 897.232248] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cad5cff7-168d-41ab-a2c9-27e138216fc4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.239057] env[60175]: DEBUG oslo_vmware.api [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Waiting for the task: (returnval){ [ 897.239057] env[60175]: value = "task-4292940" [ 897.239057] env[60175]: _type = "Task" [ 897.239057] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 897.246549] env[60175]: DEBUG oslo_vmware.api [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Task: {'id': task-4292940, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 897.680539] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 897.680839] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Creating directory with path [datastore2] vmware_temp/6459550b-d14e-4a10-89dd-f8fbea3013be/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 897.680987] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b0619bec-0fa5-4c3b-ae2c-1101d5a7e4cb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.691669] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Created directory with path [datastore2] vmware_temp/6459550b-d14e-4a10-89dd-f8fbea3013be/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 897.691848] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Fetch image to [datastore2] vmware_temp/6459550b-d14e-4a10-89dd-f8fbea3013be/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 897.692033] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/6459550b-d14e-4a10-89dd-f8fbea3013be/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 897.692718] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-725612ba-0684-42fb-b2e9-6e2f8240476f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.698961] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16182ff2-a374-45b0-b471-c0932ed3e4af {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.707631] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ceaa15-6bef-4d0f-a892-b68ec5da9ac1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.738852] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6865bf73-3418-484a-9bee-4975fe32ba31 {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.749130] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f4db47f7-61b2-4b46-b0b8-34f5a0f9131e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.750711] env[60175]: DEBUG oslo_vmware.api [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Task: {'id': task-4292940, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076694} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 897.750958] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 897.751148] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 897.751311] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 897.751481] env[60175]: INFO nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Took 0.60 seconds to destroy the instance on the hypervisor. 
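The entries above show the pattern oslo_vmware follows for vCenter operations: a task (SearchDatastore_Task, DeleteDatastoreFile_Task) is submitted and then polled via wait_for_task until it reports success or a fault, producing the "progress is 0%" and "completed successfully" lines. A minimal sketch of that polling loop is given below; get_task_info is a hypothetical callback standing in for reading the task's info property and is not an oslo.vmware API.

import time

def wait_for_vcenter_task(task_ref, get_task_info, poll_interval=0.5):
    """Poll a vCenter task until it finishes; raise if it ends in error."""
    while True:
        info = get_task_info(task_ref)          # hypothetical property read
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # oslo.vmware translates the fault (e.g. InvalidArgument) into a
            # VimFaultException at this point in its own _poll_task loop.
            raise RuntimeError(info['error'])
        # 'queued' or 'running': report progress and poll again, as the
        # "progress is 0%" lines above do.
        print('Task %s progress is %s%%' % (task_ref, info.get('progress', 0)))
        time.sleep(poll_interval)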
[ 897.753538] env[60175]: DEBUG nova.compute.claims [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 897.753698] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 897.753900] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 897.770390] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 897.813984] env[60175]: DEBUG oslo_vmware.rw_handles [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6459550b-d14e-4a10-89dd-f8fbea3013be/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 897.870809] env[60175]: DEBUG oslo_vmware.rw_handles [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 897.871024] env[60175]: DEBUG oslo_vmware.rw_handles [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6459550b-d14e-4a10-89dd-f8fbea3013be/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 897.942075] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-726f2505-fef8-46a8-b3b7-3ae901c914c5 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.949650] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-437b4632-df36-4932-948f-a5de7a58f40f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.979459] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d488e33-ea1d-4ff4-a758-d628b325e233 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.986397] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ccbd51a-8b17-4b95-a262-a1845ef5e758 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.999199] env[60175]: DEBUG nova.compute.provider_tree [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 898.007740] env[60175]: DEBUG nova.scheduler.client.report [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 898.020981] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.267s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.021613] env[60175]: ERROR nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 898.021613] env[60175]: Faults: ['InvalidArgument'] [ 898.021613] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Traceback (most recent call last): [ 898.021613] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 898.021613] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] 
self.driver.spawn(context, instance, image_meta, [ 898.021613] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 898.021613] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] self._vmops.spawn(context, instance, image_meta, injected_files, [ 898.021613] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 898.021613] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] self._fetch_image_if_missing(context, vi) [ 898.021613] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 898.021613] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] image_cache(vi, tmp_image_ds_loc) [ 898.021613] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] vm_util.copy_virtual_disk( [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] session._wait_for_task(vmdk_copy_task) [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] return self.wait_for_task(task_ref) [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] return evt.wait() [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] result = hub.switch() [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] return self.greenlet.switch() [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 898.022086] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] self.f(*self.args, **self.kw) [ 898.022451] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 898.022451] env[60175]: ERROR nova.compute.manager [instance: 
7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] raise exceptions.translate_fault(task_info.error) [ 898.022451] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 898.022451] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Faults: ['InvalidArgument'] [ 898.022451] env[60175]: ERROR nova.compute.manager [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] [ 898.022451] env[60175]: DEBUG nova.compute.utils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 898.023919] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Build of instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 was re-scheduled: A specified parameter was not correct: fileType [ 898.023919] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 898.024302] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 898.024525] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 898.024608] env[60175]: DEBUG nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 898.024768] env[60175]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 898.244801] env[60175]: DEBUG nova.network.neutron [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 898.254910] env[60175]: INFO nova.compute.manager [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Took 0.23 seconds to deallocate network for instance. 
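The traceback above ends in the usual failure path: spawn raises a VIM fault, the resource claim is aborted under the "compute_resources" lock, the instance network is deallocated, and the build is handed back for re-scheduling. A simplified sketch of that flow follows, using assumed illustrative names rather than Nova's actual interface.

class RescheduledException(Exception):
    """Raised to hand a failed build back to the scheduler."""

def build_and_run(claim, driver, network_api, instance):
    try:
        driver.spawn(instance)
    except Exception as err:                     # e.g. InvalidArgument: fileType
        claim.abort()                            # return the claimed VCPU/RAM/disk
        network_api.deallocate_for_instance(instance)
        raise RescheduledException(
            'Build of instance %s was re-scheduled: %s' % (instance, err))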
[ 898.343449] env[60175]: INFO nova.scheduler.client.report [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Deleted allocations for instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 [ 898.360035] env[60175]: DEBUG oslo_concurrency.lockutils [None req-a54d284d-fd1b-4d30-a01a-724c3a3d446c tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 339.252s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.361132] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 139.221s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 898.361349] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 898.361754] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 898.361754] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.363732] env[60175]: INFO nova.compute.manager [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Terminating instance [ 898.365587] env[60175]: DEBUG nova.compute.manager [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Start destroying the instance on the hypervisor. 
{{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 898.365782] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 898.366039] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b25556ac-219e-4631-9b55-dd10adcb419c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.376089] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eae87726-88ed-4225-a479-ebe77e856296 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.387019] env[60175]: DEBUG nova.compute.manager [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] [instance: 71244679-78d6-4d49-b4b5-ef96fd313ae8] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 898.408517] env[60175]: WARNING nova.virt.vmwareapi.vmops [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658 could not be found. [ 898.408738] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 898.408932] env[60175]: INFO nova.compute.manager [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Took 0.04 seconds to destroy the instance on the hypervisor. [ 898.409212] env[60175]: DEBUG oslo.service.loopingcall [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 898.409425] env[60175]: DEBUG nova.compute.manager [-] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 898.409548] env[60175]: DEBUG nova.network.neutron [-] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 898.416128] env[60175]: DEBUG nova.compute.manager [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] [instance: 71244679-78d6-4d49-b4b5-ef96fd313ae8] Instance disappeared before build. 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 898.435443] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6171457e-e860-4a70-9e29-308983af6168 tempest-SecurityGroupsTestJSON-754098049 tempest-SecurityGroupsTestJSON-754098049-project-member] Lock "71244679-78d6-4d49-b4b5-ef96fd313ae8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 241.140s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.444728] env[60175]: DEBUG nova.compute.manager [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] [instance: fb825c5f-bd66-40aa-8027-cb425f3b9b96] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 898.447320] env[60175]: DEBUG nova.network.neutron [-] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 898.453893] env[60175]: INFO nova.compute.manager [-] [instance: 7082a2a5-377a-47d2-bfbb-c7eb8b1c8658] Took 0.04 seconds to deallocate network for instance. [ 898.474580] env[60175]: DEBUG nova.compute.manager [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] [instance: fb825c5f-bd66-40aa-8027-cb425f3b9b96] Instance disappeared before build. {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 898.495918] env[60175]: DEBUG oslo_concurrency.lockutils [None req-554d3b0c-04fc-4d9e-922c-f3b7f39c74e5 tempest-InstanceActionsTestJSON-1474696113 tempest-InstanceActionsTestJSON-1474696113-project-member] Lock "fb825c5f-bd66-40aa-8027-cb425f3b9b96" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 234.135s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.511528] env[60175]: DEBUG nova.compute.manager [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: c76409ad-b0aa-4da6-ac83-58f617ec2588] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 898.535576] env[60175]: DEBUG nova.compute.manager [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] [instance: c76409ad-b0aa-4da6-ac83-58f617ec2588] Instance disappeared before build. 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 898.554208] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bb555319-02fe-4615-a8eb-85e9ee3f321d tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "7082a2a5-377a-47d2-bfbb-c7eb8b1c8658" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.193s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.562008] env[60175]: DEBUG oslo_concurrency.lockutils [None req-27f74e42-677d-4112-bb9f-55e756a3a7fd tempest-DeleteServersTestJSON-1745354174 tempest-DeleteServersTestJSON-1745354174-project-member] Lock "c76409ad-b0aa-4da6-ac83-58f617ec2588" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.615s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.569714] env[60175]: DEBUG nova.compute.manager [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] [instance: 63823a4b-97e0-48f9-9fb9-7c4fe3858343] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 898.590873] env[60175]: DEBUG nova.compute.manager [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] [instance: 63823a4b-97e0-48f9-9fb9-7c4fe3858343] Instance disappeared before build. {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 898.609200] env[60175]: DEBUG oslo_concurrency.lockutils [None req-94915086-54d2-4bcf-bcb7-396933e24f9e tempest-ServerGroupTestJSON-801235064 tempest-ServerGroupTestJSON-801235064-project-member] Lock "63823a4b-97e0-48f9-9fb9-7c4fe3858343" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.436s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.616441] env[60175]: DEBUG nova.compute.manager [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: 070d142d-6a47-49bc-a061-3101da79447a] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 898.637390] env[60175]: DEBUG nova.compute.manager [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: 070d142d-6a47-49bc-a061-3101da79447a] Instance disappeared before build. 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 898.658286] env[60175]: DEBUG oslo_concurrency.lockutils [None req-3d92add3-634e-4b1e-8d5a-9590d765273e tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "070d142d-6a47-49bc-a061-3101da79447a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.584s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.668347] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 898.711881] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 898.712157] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 898.713602] env[60175]: INFO nova.compute.claims [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 898.838569] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e44bd8f-93ce-42d9-8ee9-84f6ea7b0374 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.845947] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db72820b-0d41-40d7-8289-d0a8cedaae6a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.876015] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b904eb8-fae2-4328-8b35-cc53b1590d9d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.882481] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38d2dcd0-fafe-4351-bf7a-f894ac2b5140 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.895396] env[60175]: DEBUG nova.compute.provider_tree [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 898.903657] env[60175]: DEBUG nova.scheduler.client.report [None 
req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 898.916257] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.916755] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 898.948479] env[60175]: DEBUG nova.compute.utils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 898.949929] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 898.950136] env[60175]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 898.957879] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Start building block device mappings for instance. 
{{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 899.015204] env[60175]: DEBUG nova.policy [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6b2c42f3195e46788132fb07d6b771ae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e15efea8f51244ebab9a72c9fcd83456', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 899.018271] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Start spawning the instance on the hypervisor. {{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 899.039030] env[60175]: DEBUG nova.virt.hardware [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 899.040037] env[60175]: DEBUG nova.virt.hardware [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 899.040037] env[60175]: DEBUG nova.virt.hardware [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 899.040037] env[60175]: DEBUG nova.virt.hardware [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 899.040037] env[60175]: DEBUG nova.virt.hardware [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 899.040037] env[60175]: DEBUG nova.virt.hardware [None 
req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 899.040265] env[60175]: DEBUG nova.virt.hardware [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 899.040301] env[60175]: DEBUG nova.virt.hardware [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 899.040664] env[60175]: DEBUG nova.virt.hardware [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 899.040664] env[60175]: DEBUG nova.virt.hardware [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 899.040824] env[60175]: DEBUG nova.virt.hardware [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 899.042033] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-210c4a12-8318-404e-9092-ca1bccbc1d4e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.049384] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9f29bcd-de78-467f-9909-08563fda067f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.320714] env[60175]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Successfully created port: 4457bd5e-7c83-43e8-90d8-1814e9d40cda {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 900.216423] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "8f4635d8-5789-4402-8ca2-543b4d4dfc76" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 900.216702] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc 
tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "8f4635d8-5789-4402-8ca2-543b4d4dfc76" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 900.403354] env[60175]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Successfully updated port: 4457bd5e-7c83-43e8-90d8-1814e9d40cda {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 900.415247] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "refresh_cache-67cfe7ba-4590-451b-9e1a-340977b597a4" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 900.415387] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquired lock "refresh_cache-67cfe7ba-4590-451b-9e1a-340977b597a4" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 900.415541] env[60175]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 900.432561] env[60175]: DEBUG nova.compute.manager [req-7747d29a-f23e-42ad-82a9-88ff29f4b586 req-1c76b289-6c47-41cb-a190-9fc141c7e7d1 service nova] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Received event network-vif-plugged-4457bd5e-7c83-43e8-90d8-1814e9d40cda {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 900.432780] env[60175]: DEBUG oslo_concurrency.lockutils [req-7747d29a-f23e-42ad-82a9-88ff29f4b586 req-1c76b289-6c47-41cb-a190-9fc141c7e7d1 service nova] Acquiring lock "67cfe7ba-4590-451b-9e1a-340977b597a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 900.432984] env[60175]: DEBUG oslo_concurrency.lockutils [req-7747d29a-f23e-42ad-82a9-88ff29f4b586 req-1c76b289-6c47-41cb-a190-9fc141c7e7d1 service nova] Lock "67cfe7ba-4590-451b-9e1a-340977b597a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 900.433169] env[60175]: DEBUG oslo_concurrency.lockutils [req-7747d29a-f23e-42ad-82a9-88ff29f4b586 req-1c76b289-6c47-41cb-a190-9fc141c7e7d1 service nova] Lock "67cfe7ba-4590-451b-9e1a-340977b597a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 900.433330] env[60175]: DEBUG nova.compute.manager [req-7747d29a-f23e-42ad-82a9-88ff29f4b586 req-1c76b289-6c47-41cb-a190-9fc141c7e7d1 service nova] [instance: 
67cfe7ba-4590-451b-9e1a-340977b597a4] No waiting events found dispatching network-vif-plugged-4457bd5e-7c83-43e8-90d8-1814e9d40cda {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 900.433489] env[60175]: WARNING nova.compute.manager [req-7747d29a-f23e-42ad-82a9-88ff29f4b586 req-1c76b289-6c47-41cb-a190-9fc141c7e7d1 service nova] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Received unexpected event network-vif-plugged-4457bd5e-7c83-43e8-90d8-1814e9d40cda for instance with vm_state building and task_state spawning. [ 900.471877] env[60175]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 900.641317] env[60175]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Updating instance_info_cache with network_info: [{"id": "4457bd5e-7c83-43e8-90d8-1814e9d40cda", "address": "fa:16:3e:de:52:73", "network": {"id": "15025bc6-3b98-442b-bc59-7f3d69fee120", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1948601089-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e15efea8f51244ebab9a72c9fcd83456", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f78b07ea-f425-4622-84f4-706a5d8820a7", "external-id": "nsx-vlan-transportzone-126", "segmentation_id": 126, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4457bd5e-7c", "ovs_interfaceid": "4457bd5e-7c83-43e8-90d8-1814e9d40cda", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 900.651381] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Releasing lock "refresh_cache-67cfe7ba-4590-451b-9e1a-340977b597a4" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 900.651674] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Instance network_info: |[{"id": "4457bd5e-7c83-43e8-90d8-1814e9d40cda", "address": "fa:16:3e:de:52:73", "network": {"id": "15025bc6-3b98-442b-bc59-7f3d69fee120", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1948601089-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e15efea8f51244ebab9a72c9fcd83456", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f78b07ea-f425-4622-84f4-706a5d8820a7", "external-id": "nsx-vlan-transportzone-126", "segmentation_id": 126, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4457bd5e-7c", "ovs_interfaceid": "4457bd5e-7c83-43e8-90d8-1814e9d40cda", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 900.652056] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:de:52:73', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f78b07ea-f425-4622-84f4-706a5d8820a7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4457bd5e-7c83-43e8-90d8-1814e9d40cda', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 900.659436] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Creating folder: Project (e15efea8f51244ebab9a72c9fcd83456). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 900.659904] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bfb28a7b-a8a9-495e-a300-f4c7daf3e768 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.672976] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Created folder: Project (e15efea8f51244ebab9a72c9fcd83456) in parent group-v845475. [ 900.673177] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Creating folder: Instances. Parent ref: group-v845534. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 900.673389] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9b3be3ee-f086-4010-997c-c81d6b466396 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.684189] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Created folder: Instances in parent group-v845534. [ 900.684409] env[60175]: DEBUG oslo.service.loopingcall [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 900.684577] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 900.684759] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-85b871b0-f329-403a-a6e0-8b0ce298110c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.703504] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 900.703504] env[60175]: value = "task-4292943" [ 900.703504] env[60175]: _type = "Task" [ 900.703504] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 900.710572] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292943, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 901.214097] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292943, 'name': CreateVM_Task, 'duration_secs': 0.279116} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 901.214309] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 901.214967] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 901.215155] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 901.215466] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 901.215700] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-82ac0fce-cf22-4153-b161-3d3e817966b5 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.220428] env[60175]: DEBUG oslo_vmware.api [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Waiting for the task: (returnval){ [ 901.220428] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e33e4d-75c7-73bf-6cce-1766d3c98b2b" [ 901.220428] env[60175]: _type = "Task" [ 901.220428] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 901.227909] env[60175]: DEBUG oslo_vmware.api [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e33e4d-75c7-73bf-6cce-1766d3c98b2b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 901.730648] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 901.730908] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 901.731127] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 902.465444] env[60175]: DEBUG nova.compute.manager [req-20f5237d-ded1-47d7-b49d-ba0ca413e265 req-8db04c96-e688-407c-8fce-c56fd574e65a service nova] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Received event network-changed-4457bd5e-7c83-43e8-90d8-1814e9d40cda {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 902.465705] env[60175]: DEBUG nova.compute.manager [req-20f5237d-ded1-47d7-b49d-ba0ca413e265 req-8db04c96-e688-407c-8fce-c56fd574e65a service nova] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Refreshing instance network info cache due to event network-changed-4457bd5e-7c83-43e8-90d8-1814e9d40cda. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 902.465937] env[60175]: DEBUG oslo_concurrency.lockutils [req-20f5237d-ded1-47d7-b49d-ba0ca413e265 req-8db04c96-e688-407c-8fce-c56fd574e65a service nova] Acquiring lock "refresh_cache-67cfe7ba-4590-451b-9e1a-340977b597a4" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 902.466119] env[60175]: DEBUG oslo_concurrency.lockutils [req-20f5237d-ded1-47d7-b49d-ba0ca413e265 req-8db04c96-e688-407c-8fce-c56fd574e65a service nova] Acquired lock "refresh_cache-67cfe7ba-4590-451b-9e1a-340977b597a4" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 902.466285] env[60175]: DEBUG nova.network.neutron [req-20f5237d-ded1-47d7-b49d-ba0ca413e265 req-8db04c96-e688-407c-8fce-c56fd574e65a service nova] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Refreshing network info cache for port 4457bd5e-7c83-43e8-90d8-1814e9d40cda {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 902.798703] env[60175]: DEBUG nova.network.neutron [req-20f5237d-ded1-47d7-b49d-ba0ca413e265 req-8db04c96-e688-407c-8fce-c56fd574e65a service nova] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Updated VIF entry in instance network info cache for port 4457bd5e-7c83-43e8-90d8-1814e9d40cda. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 902.800070] env[60175]: DEBUG nova.network.neutron [req-20f5237d-ded1-47d7-b49d-ba0ca413e265 req-8db04c96-e688-407c-8fce-c56fd574e65a service nova] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Updating instance_info_cache with network_info: [{"id": "4457bd5e-7c83-43e8-90d8-1814e9d40cda", "address": "fa:16:3e:de:52:73", "network": {"id": "15025bc6-3b98-442b-bc59-7f3d69fee120", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1948601089-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e15efea8f51244ebab9a72c9fcd83456", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f78b07ea-f425-4622-84f4-706a5d8820a7", "external-id": "nsx-vlan-transportzone-126", "segmentation_id": 126, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4457bd5e-7c", "ovs_interfaceid": "4457bd5e-7c83-43e8-90d8-1814e9d40cda", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 902.808481] env[60175]: DEBUG oslo_concurrency.lockutils [req-20f5237d-ded1-47d7-b49d-ba0ca413e265 req-8db04c96-e688-407c-8fce-c56fd574e65a service nova] Releasing lock "refresh_cache-67cfe7ba-4590-451b-9e1a-340977b597a4" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 925.950455] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 925.960439] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 925.960648] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 925.960795] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 925.960941] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 925.962065] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d4e80df-b208-4b7e-93c1-f8b5d4023582 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 925.970936] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01b112c5-73b6-4cb7-b372-b81ce54da142 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 925.984680] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a6d8a2e-a777-4e26-ad4b-019d30870116 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 925.990620] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b03242ed-2264-42d0-a79a-fa4d1c386d59 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 926.020057] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180718MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 926.020187] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 926.020405] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 926.067883] env[60175]: DEBUG nova.compute.resource_tracker 
[None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance da3eaeea-ce26-40eb-af8b-8857f927e431 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 926.068069] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 8bc7299c-35d4-4e9f-a243-2834fbadd987 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 926.068172] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 81af879b-3bc3-4aff-a99d-98d3aba73512 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 926.068293] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 843d4db6-c1fb-4b74-ad3c-779e309a170e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 926.068406] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 67cfe7ba-4590-451b-9e1a-340977b597a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 926.079454] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance e2c5328d-ba5a-4348-8a3f-2a9f745e8f08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 926.088935] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance a45c150e-942b-454a-ab59-aa6b191bfada has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 926.097724] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 8f4635d8-5789-4402-8ca2-543b4d4dfc76 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 926.097927] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 926.098098] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=149GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 926.196107] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4e3e467-25cc-4627-a781-d2d33d8824a9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 926.198987] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31970d9d-8410-4c4d-b968-b054f1545754 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 926.229795] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d24fcb33-3617-47cc-be39-005a085cb70b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 926.237172] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-710959d2-78c5-4853-b49c-e59b192742e3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 926.250403] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 926.258639] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 926.271594] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 926.271805] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.251s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 927.272035] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running 
periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 927.950092] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 928.945717] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 928.949265] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 929.950960] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 929.951301] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 929.951301] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 929.966587] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 929.966782] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 929.966931] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 929.967067] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 929.967194] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 929.967314] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 930.950248] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 930.950471] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 931.951625] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 931.952043] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 944.218226] env[60175]: WARNING oslo_vmware.rw_handles [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 944.218226] env[60175]: ERROR oslo_vmware.rw_handles [ 944.218803] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/6459550b-d14e-4a10-89dd-f8fbea3013be/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 944.220543] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Caching image {{(pid=60175) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 944.220783] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Copying Virtual Disk [datastore2] vmware_temp/6459550b-d14e-4a10-89dd-f8fbea3013be/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/6459550b-d14e-4a10-89dd-f8fbea3013be/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 944.221063] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3970866d-949e-47ba-a3ce-8afe21676fd0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.228250] env[60175]: DEBUG oslo_vmware.api [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Waiting for the task: (returnval){ [ 944.228250] env[60175]: value = "task-4292944" [ 944.228250] env[60175]: _type = "Task" [ 944.228250] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 944.236272] env[60175]: DEBUG oslo_vmware.api [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Task: {'id': task-4292944, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 944.738546] env[60175]: DEBUG oslo_vmware.exceptions [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Fault InvalidArgument not matched. 
{{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 944.738808] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 944.739413] env[60175]: ERROR nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 944.739413] env[60175]: Faults: ['InvalidArgument'] [ 944.739413] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Traceback (most recent call last): [ 944.739413] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 944.739413] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] yield resources [ 944.739413] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 944.739413] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] self.driver.spawn(context, instance, image_meta, [ 944.739413] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 944.739413] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] self._vmops.spawn(context, instance, image_meta, injected_files, [ 944.739413] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 944.739413] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] self._fetch_image_if_missing(context, vi) [ 944.739413] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] image_cache(vi, tmp_image_ds_loc) [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] vm_util.copy_virtual_disk( [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] session._wait_for_task(vmdk_copy_task) [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] return self.wait_for_task(task_ref) [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] return evt.wait() [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] result = hub.switch() [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 944.739733] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] return self.greenlet.switch() [ 944.740039] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 944.740039] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] self.f(*self.args, **self.kw) [ 944.740039] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 944.740039] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] raise exceptions.translate_fault(task_info.error) [ 944.740039] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 944.740039] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Faults: ['InvalidArgument'] [ 944.740039] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] [ 944.740039] env[60175]: INFO nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Terminating instance [ 944.741899] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 944.741899] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 944.741899] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f31d9f0e-0c6f-4a89-b4bb-685b7d356432 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.744026] env[60175]: DEBUG nova.compute.manager [None 
req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 944.744167] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 944.744888] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d1ed751-8d47-4aa6-ae99-e304d9d9f403 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.752148] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 944.752375] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-90c6232c-3501-4690-9e52-899db1371570 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.754615] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 944.754784] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 944.755737] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-88cf3dc2-6cf2-4c5b-a528-e25b80c396c9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.760980] env[60175]: DEBUG oslo_vmware.api [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Waiting for the task: (returnval){ [ 944.760980] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e2562d-c37d-3e81-c907-87ca4f5b90f6" [ 944.760980] env[60175]: _type = "Task" [ 944.760980] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 944.770233] env[60175]: DEBUG oslo_vmware.api [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52e2562d-c37d-3e81-c907-87ca4f5b90f6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 944.827136] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 944.827252] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 944.827422] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Deleting the datastore file [datastore2] 8bc7299c-35d4-4e9f-a243-2834fbadd987 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 944.827683] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a8131d4c-0af5-47be-a873-aed6ebe773f2 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 944.834366] env[60175]: DEBUG oslo_vmware.api [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Waiting for the task: (returnval){ [ 944.834366] env[60175]: value = "task-4292946" [ 944.834366] env[60175]: _type = "Task" [ 944.834366] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 944.842525] env[60175]: DEBUG oslo_vmware.api [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Task: {'id': task-4292946, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 945.271924] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 945.272280] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Creating directory with path [datastore2] vmware_temp/1500bd09-dcfc-47d9-8611-109ada745846/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 945.272374] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d8191474-eb28-4306-b7af-04ecfe02b087 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.283309] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Created directory with path [datastore2] vmware_temp/1500bd09-dcfc-47d9-8611-109ada745846/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 945.283485] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Fetch image to [datastore2] vmware_temp/1500bd09-dcfc-47d9-8611-109ada745846/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 945.283646] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/1500bd09-dcfc-47d9-8611-109ada745846/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 945.284368] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c23defbc-a23c-47b4-9bda-ad9275da1301 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.290567] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db761f90-f5cb-45e5-bc9e-8bf7eec9f39a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.300386] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3b1db41-c241-45ab-9602-a2280ae9e34c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.330353] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe4059f9-7269-452a-ba64-06df5555cc6b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.338081] env[60175]: DEBUG 
oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1570088c-76ab-491d-9cec-603cd625227d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.344211] env[60175]: DEBUG oslo_vmware.api [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Task: {'id': task-4292946, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068616} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 945.344479] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 945.344670] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 945.344836] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 945.345015] env[60175]: INFO nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 945.347156] env[60175]: DEBUG nova.compute.claims [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 945.347334] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 945.347595] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 945.358434] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 945.406259] env[60175]: DEBUG oslo_vmware.rw_handles [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1500bd09-dcfc-47d9-8611-109ada745846/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 945.463349] env[60175]: DEBUG oslo_vmware.rw_handles [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 945.463532] env[60175]: DEBUG oslo_vmware.rw_handles [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1500bd09-dcfc-47d9-8611-109ada745846/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 945.534991] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba00c70a-b146-48b4-a565-f0131791f9cd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.542233] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95ec49e8-4a31-4b84-8b82-88caf2a377d7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.572367] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23f04911-d772-47b2-8719-dc397a2ef08d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.578957] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a16ec63b-87d2-494f-ac78-faba728c01ef {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.591856] env[60175]: DEBUG nova.compute.provider_tree [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 945.600235] env[60175]: DEBUG nova.scheduler.client.report [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 945.613162] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.266s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.613667] env[60175]: ERROR nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 945.613667] env[60175]: Faults: ['InvalidArgument'] [ 945.613667] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Traceback (most recent call last): [ 945.613667] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 945.613667] env[60175]: ERROR nova.compute.manager [instance: 
8bc7299c-35d4-4e9f-a243-2834fbadd987] self.driver.spawn(context, instance, image_meta, [ 945.613667] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 945.613667] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] self._vmops.spawn(context, instance, image_meta, injected_files, [ 945.613667] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 945.613667] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] self._fetch_image_if_missing(context, vi) [ 945.613667] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 945.613667] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] image_cache(vi, tmp_image_ds_loc) [ 945.613667] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] vm_util.copy_virtual_disk( [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] session._wait_for_task(vmdk_copy_task) [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] return self.wait_for_task(task_ref) [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] return evt.wait() [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] result = hub.switch() [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] return self.greenlet.switch() [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 945.613926] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] self.f(*self.args, **self.kw) [ 945.614392] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 945.614392] env[60175]: ERROR 
nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] raise exceptions.translate_fault(task_info.error) [ 945.614392] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 945.614392] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Faults: ['InvalidArgument'] [ 945.614392] env[60175]: ERROR nova.compute.manager [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] [ 945.614392] env[60175]: DEBUG nova.compute.utils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 945.615684] env[60175]: DEBUG nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Build of instance 8bc7299c-35d4-4e9f-a243-2834fbadd987 was re-scheduled: A specified parameter was not correct: fileType [ 945.615684] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 945.616067] env[60175]: DEBUG nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 945.616241] env[60175]: DEBUG nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 945.616404] env[60175]: DEBUG nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 945.616562] env[60175]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 945.848131] env[60175]: DEBUG nova.network.neutron [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 945.860535] env[60175]: INFO nova.compute.manager [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Took 0.24 seconds to deallocate network for instance. [ 945.961052] env[60175]: INFO nova.scheduler.client.report [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Deleted allocations for instance 8bc7299c-35d4-4e9f-a243-2834fbadd987 [ 945.987926] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f559d28d-7422-46e8-8a84-40de3eeebb4d tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "8bc7299c-35d4-4e9f-a243-2834fbadd987" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 381.987s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.987926] env[60175]: DEBUG oslo_concurrency.lockutils [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "8bc7299c-35d4-4e9f-a243-2834fbadd987" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 183.356s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 945.987926] env[60175]: DEBUG oslo_concurrency.lockutils [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Acquiring lock "8bc7299c-35d4-4e9f-a243-2834fbadd987-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 945.988185] env[60175]: DEBUG oslo_concurrency.lockutils [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "8bc7299c-35d4-4e9f-a243-2834fbadd987-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s 
{{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 945.988185] env[60175]: DEBUG oslo_concurrency.lockutils [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "8bc7299c-35d4-4e9f-a243-2834fbadd987-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 945.989055] env[60175]: INFO nova.compute.manager [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Terminating instance [ 945.990793] env[60175]: DEBUG nova.compute.manager [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 945.991029] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 945.991497] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c4a8a0ab-3475-4bdb-ace0-5399cef45f2e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 945.995268] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 946.001608] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65c671d5-63a2-4bf6-b5f2-bb34f477e7dc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.029012] env[60175]: WARNING nova.virt.vmwareapi.vmops [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8bc7299c-35d4-4e9f-a243-2834fbadd987 could not be found. [ 946.029320] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 946.029549] env[60175]: INFO nova.compute.manager [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 946.029843] env[60175]: DEBUG oslo.service.loopingcall [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 946.030109] env[60175]: DEBUG nova.compute.manager [-] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 946.030245] env[60175]: DEBUG nova.network.neutron [-] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 946.050108] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.050347] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.051735] env[60175]: INFO nova.compute.claims [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 946.061803] env[60175]: DEBUG nova.network.neutron [-] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 946.076669] env[60175]: INFO nova.compute.manager [-] [instance: 8bc7299c-35d4-4e9f-a243-2834fbadd987] Took 0.05 seconds to deallocate network for instance. 
[ 946.164228] env[60175]: DEBUG oslo_concurrency.lockutils [None req-276d08bc-7661-4710-977c-c409ab6c7661 tempest-AttachInterfacesTestJSON-384754491 tempest-AttachInterfacesTestJSON-384754491-project-member] Lock "8bc7299c-35d4-4e9f-a243-2834fbadd987" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.178s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.186760] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6043206c-26d7-4002-9f83-88908d3ded68 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.193923] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-835fb091-ce84-4243-8ee2-86626d67efae {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.223121] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-790f4817-4b4e-4733-9066-efcc6177f076 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.230396] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9670324f-788f-4a21-ad47-52cf8c45bf4f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.243588] env[60175]: DEBUG nova.compute.provider_tree [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 946.251816] env[60175]: DEBUG nova.scheduler.client.report [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 946.264679] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.265187] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Start building networks asynchronously for instance. 
{{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 946.297876] env[60175]: DEBUG nova.compute.utils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 946.299350] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 946.299521] env[60175]: DEBUG nova.network.neutron [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 946.307723] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 946.355494] env[60175]: DEBUG nova.policy [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f6f1fc08157841f3bdd66c7e1bc5afa8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '027d599d310b4abf9ce371b09bb3253b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 946.367200] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Start spawning the instance on the hypervisor. 
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 946.391417] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 946.391661] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 946.391815] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 946.392053] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 946.392239] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 946.392395] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 946.392594] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 946.392746] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 946.392903] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf 
tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 946.393111] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 946.393290] env[60175]: DEBUG nova.virt.hardware [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 946.394117] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c96e5c0-44e0-4c9a-a155-72414a030914 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.401708] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bc86e5c-be90-4888-86ca-bee0efeddd83 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.619410] env[60175]: DEBUG nova.network.neutron [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Successfully created port: bb80456f-edd5-48f8-90d1-2263c6b3e6fa {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 946.897281] env[60175]: DEBUG nova.network.neutron [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Successfully created port: dd85cf77-b1fd-4be9-8b53-49cd7b671dfd {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 947.973108] env[60175]: DEBUG nova.network.neutron [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Successfully updated port: bb80456f-edd5-48f8-90d1-2263c6b3e6fa {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 948.062303] env[60175]: DEBUG nova.compute.manager [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Received event network-vif-plugged-bb80456f-edd5-48f8-90d1-2263c6b3e6fa {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 948.062469] env[60175]: DEBUG oslo_concurrency.lockutils [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] Acquiring lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.062672] env[60175]: DEBUG oslo_concurrency.lockutils [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] Lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.062835] env[60175]: DEBUG oslo_concurrency.lockutils [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] Lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.063034] env[60175]: DEBUG nova.compute.manager [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] No waiting events found dispatching network-vif-plugged-bb80456f-edd5-48f8-90d1-2263c6b3e6fa {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 948.063204] env[60175]: WARNING nova.compute.manager [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Received unexpected event network-vif-plugged-bb80456f-edd5-48f8-90d1-2263c6b3e6fa for instance with vm_state building and task_state spawning. [ 948.063359] env[60175]: DEBUG nova.compute.manager [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Received event network-changed-bb80456f-edd5-48f8-90d1-2263c6b3e6fa {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 948.063509] env[60175]: DEBUG nova.compute.manager [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Refreshing instance network info cache due to event network-changed-bb80456f-edd5-48f8-90d1-2263c6b3e6fa. {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 948.065809] env[60175]: DEBUG oslo_concurrency.lockutils [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] Acquiring lock "refresh_cache-e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 948.066009] env[60175]: DEBUG oslo_concurrency.lockutils [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] Acquired lock "refresh_cache-e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 948.066186] env[60175]: DEBUG nova.network.neutron [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Refreshing network info cache for port bb80456f-edd5-48f8-90d1-2263c6b3e6fa {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 948.121988] env[60175]: DEBUG nova.network.neutron [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 948.411167] env[60175]: DEBUG nova.network.neutron [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 948.421763] env[60175]: DEBUG oslo_concurrency.lockutils [req-b7bd7790-5bf6-4951-afc9-1668180f7b3f req-f81c93a8-cd2b-4ef1-b44e-61f4d566dae7 service nova] Releasing lock "refresh_cache-e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 948.947148] env[60175]: DEBUG nova.network.neutron [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Successfully updated port: dd85cf77-b1fd-4be9-8b53-49cd7b671dfd {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 948.956191] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "refresh_cache-e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 948.956326] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquired lock "refresh_cache-e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 948.956473] env[60175]: DEBUG nova.network.neutron [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 948.990743] env[60175]: DEBUG nova.network.neutron [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 949.290482] env[60175]: DEBUG nova.network.neutron [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Updating instance_info_cache with network_info: [{"id": "bb80456f-edd5-48f8-90d1-2263c6b3e6fa", "address": "fa:16:3e:8d:2e:06", "network": {"id": "35d93d8e-c794-4369-818a-56c90c185516", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1854994814", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.137", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "027d599d310b4abf9ce371b09bb3253b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "557aba95-8968-407a-bac2-2fae66f7c8e5", "external-id": "nsx-vlan-transportzone-45", "segmentation_id": 45, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbb80456f-ed", "ovs_interfaceid": "bb80456f-edd5-48f8-90d1-2263c6b3e6fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dd85cf77-b1fd-4be9-8b53-49cd7b671dfd", "address": "fa:16:3e:08:74:d7", "network": {"id": "5beb09e5-5da8-486f-a57e-f05a3b5875b6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1720664003", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.249", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "027d599d310b4abf9ce371b09bb3253b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "79ece966-6187-47d7-bce7-cc39df14ac67", "external-id": "nsx-vlan-transportzone-472", "segmentation_id": 472, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdd85cf77-b1", "ovs_interfaceid": "dd85cf77-b1fd-4be9-8b53-49cd7b671dfd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 949.303624] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Releasing lock "refresh_cache-e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 949.303934] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Instance network_info: |[{"id": "bb80456f-edd5-48f8-90d1-2263c6b3e6fa", "address": "fa:16:3e:8d:2e:06", "network": {"id": 
"35d93d8e-c794-4369-818a-56c90c185516", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1854994814", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.137", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "027d599d310b4abf9ce371b09bb3253b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "557aba95-8968-407a-bac2-2fae66f7c8e5", "external-id": "nsx-vlan-transportzone-45", "segmentation_id": 45, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbb80456f-ed", "ovs_interfaceid": "bb80456f-edd5-48f8-90d1-2263c6b3e6fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dd85cf77-b1fd-4be9-8b53-49cd7b671dfd", "address": "fa:16:3e:08:74:d7", "network": {"id": "5beb09e5-5da8-486f-a57e-f05a3b5875b6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1720664003", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.249", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "027d599d310b4abf9ce371b09bb3253b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "79ece966-6187-47d7-bce7-cc39df14ac67", "external-id": "nsx-vlan-transportzone-472", "segmentation_id": 472, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdd85cf77-b1", "ovs_interfaceid": "dd85cf77-b1fd-4be9-8b53-49cd7b671dfd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 949.304350] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8d:2e:06', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '557aba95-8968-407a-bac2-2fae66f7c8e5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bb80456f-edd5-48f8-90d1-2263c6b3e6fa', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:08:74:d7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '79ece966-6187-47d7-bce7-cc39df14ac67', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dd85cf77-b1fd-4be9-8b53-49cd7b671dfd', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 949.313503] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Creating folder: Project (027d599d310b4abf9ce371b09bb3253b). 
Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 949.313954] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7e5e90ad-fb03-446f-a6a8-ae82e8dd21d3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.325238] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Created folder: Project (027d599d310b4abf9ce371b09bb3253b) in parent group-v845475. [ 949.325402] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Creating folder: Instances. Parent ref: group-v845537. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 949.325604] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9168bcea-8c12-453e-8e25-475537613c7b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.334414] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Created folder: Instances in parent group-v845537. [ 949.334615] env[60175]: DEBUG oslo.service.loopingcall [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 949.334776] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 949.334948] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ad7f1fc7-75de-4be5-a46b-c31f0a7a3260 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.354474] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 949.354474] env[60175]: value = "task-4292949" [ 949.354474] env[60175]: _type = "Task" [ 949.354474] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 949.361314] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292949, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 949.864733] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292949, 'name': CreateVM_Task, 'duration_secs': 0.318394} completed successfully. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 949.864963] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 949.865771] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 949.865939] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 949.866284] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 949.866524] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3b5721da-b7b4-4103-9e4e-19332d22ec63 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.871336] env[60175]: DEBUG oslo_vmware.api [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Waiting for the task: (returnval){ [ 949.871336] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52f5a8f5-7c4f-bd25-9981-b404cc5837f1" [ 949.871336] env[60175]: _type = "Task" [ 949.871336] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 949.880484] env[60175]: DEBUG oslo_vmware.api [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52f5a8f5-7c4f-bd25-9981-b404cc5837f1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 950.091153] env[60175]: DEBUG nova.compute.manager [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Received event network-vif-plugged-dd85cf77-b1fd-4be9-8b53-49cd7b671dfd {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 950.091445] env[60175]: DEBUG oslo_concurrency.lockutils [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] Acquiring lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 950.091592] env[60175]: DEBUG oslo_concurrency.lockutils [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] Lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 950.091708] env[60175]: DEBUG oslo_concurrency.lockutils [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] Lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 950.091909] env[60175]: DEBUG nova.compute.manager [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] No waiting events found dispatching network-vif-plugged-dd85cf77-b1fd-4be9-8b53-49cd7b671dfd {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 950.092071] env[60175]: WARNING nova.compute.manager [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Received unexpected event network-vif-plugged-dd85cf77-b1fd-4be9-8b53-49cd7b671dfd for instance with vm_state building and task_state spawning. [ 950.092232] env[60175]: DEBUG nova.compute.manager [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Received event network-changed-dd85cf77-b1fd-4be9-8b53-49cd7b671dfd {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 950.092381] env[60175]: DEBUG nova.compute.manager [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Refreshing instance network info cache due to event network-changed-dd85cf77-b1fd-4be9-8b53-49cd7b671dfd. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 950.092554] env[60175]: DEBUG oslo_concurrency.lockutils [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] Acquiring lock "refresh_cache-e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.092685] env[60175]: DEBUG oslo_concurrency.lockutils [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] Acquired lock "refresh_cache-e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 950.092903] env[60175]: DEBUG nova.network.neutron [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Refreshing network info cache for port dd85cf77-b1fd-4be9-8b53-49cd7b671dfd {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 950.338494] env[60175]: DEBUG nova.network.neutron [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Updated VIF entry in instance network info cache for port dd85cf77-b1fd-4be9-8b53-49cd7b671dfd. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 950.338926] env[60175]: DEBUG nova.network.neutron [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Updating instance_info_cache with network_info: [{"id": "bb80456f-edd5-48f8-90d1-2263c6b3e6fa", "address": "fa:16:3e:8d:2e:06", "network": {"id": "35d93d8e-c794-4369-818a-56c90c185516", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1854994814", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.137", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "027d599d310b4abf9ce371b09bb3253b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "557aba95-8968-407a-bac2-2fae66f7c8e5", "external-id": "nsx-vlan-transportzone-45", "segmentation_id": 45, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbb80456f-ed", "ovs_interfaceid": "bb80456f-edd5-48f8-90d1-2263c6b3e6fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "dd85cf77-b1fd-4be9-8b53-49cd7b671dfd", "address": "fa:16:3e:08:74:d7", "network": {"id": "5beb09e5-5da8-486f-a57e-f05a3b5875b6", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1720664003", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.249", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "027d599d310b4abf9ce371b09bb3253b", "mtu": 8950, "physical_network": "default", 
"tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "79ece966-6187-47d7-bce7-cc39df14ac67", "external-id": "nsx-vlan-transportzone-472", "segmentation_id": 472, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdd85cf77-b1", "ovs_interfaceid": "dd85cf77-b1fd-4be9-8b53-49cd7b671dfd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.348744] env[60175]: DEBUG oslo_concurrency.lockutils [req-c025a0c7-1e47-4166-9146-0e7987f52ec1 req-8ba3eeba-6061-4af8-bf12-61eab80ab23e service nova] Releasing lock "refresh_cache-e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.382377] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.382551] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 950.382759] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 985.950657] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 985.960820] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 985.961038] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 985.961203] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 985.961398] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally 
available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 985.962500] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dac74f2b-6cff-4e07-9e7f-ca8af3fb937e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.971369] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db985be5-9669-4780-9bc6-a9397d03d17a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.985105] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c964bad-3034-4bfa-9337-ee55d218b45e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.991109] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-869e8ecf-f256-422d-8ed5-fdb90ea14f9d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 986.020426] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180704MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 986.020569] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 986.020748] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 986.076169] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance da3eaeea-ce26-40eb-af8b-8857f927e431 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 986.076330] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 81af879b-3bc3-4aff-a99d-98d3aba73512 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 986.076455] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 843d4db6-c1fb-4b74-ad3c-779e309a170e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 986.076571] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 67cfe7ba-4590-451b-9e1a-340977b597a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 986.076686] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance e2c5328d-ba5a-4348-8a3f-2a9f745e8f08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 986.088801] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance a45c150e-942b-454a-ab59-aa6b191bfada has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 986.102466] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 8f4635d8-5789-4402-8ca2-543b4d4dfc76 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 986.102724] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 986.102897] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=149GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 986.198132] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-606378a2-e652-4d94-99fc-b59e4250ffb4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 986.206041] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51a96e87-5a69-4130-870e-06592a0f9dc6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 986.243439] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-557fee6b-ab9f-4fde-9b72-1c2d8f6a53be {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 986.252413] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffa6723e-5918-4539-8693-ea762b0883ea {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 986.265217] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 986.273284] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 986.286909] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 986.287102] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.266s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 989.287297] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running 
periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 989.287673] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 989.950911] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 989.951120] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 989.951250] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 989.965577] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 989.965730] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 989.965862] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 989.965989] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 989.966130] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 989.966266] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 990.950202] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 990.950568] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 991.950622] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 991.951265] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 992.946294] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 992.964471] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 992.965140] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 994.237221] env[60175]: WARNING oslo_vmware.rw_handles [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 994.237221] env[60175]: ERROR oslo_vmware.rw_handles [ 994.237739] 
env[60175]: DEBUG nova.virt.vmwareapi.images [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/1500bd09-dcfc-47d9-8611-109ada745846/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 994.239794] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 994.240055] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Copying Virtual Disk [datastore2] vmware_temp/1500bd09-dcfc-47d9-8611-109ada745846/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/1500bd09-dcfc-47d9-8611-109ada745846/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 994.240372] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2903d85a-2f8d-4a4f-8d50-43ca9b7396a3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.248469] env[60175]: DEBUG oslo_vmware.api [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Waiting for the task: (returnval){ [ 994.248469] env[60175]: value = "task-4292950" [ 994.248469] env[60175]: _type = "Task" [ 994.248469] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 994.256419] env[60175]: DEBUG oslo_vmware.api [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Task: {'id': task-4292950, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 994.759259] env[60175]: DEBUG oslo_vmware.exceptions [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Fault InvalidArgument not matched. 
{{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 994.759498] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 994.760073] env[60175]: ERROR nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 994.760073] env[60175]: Faults: ['InvalidArgument'] [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Traceback (most recent call last): [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] yield resources [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] self.driver.spawn(context, instance, image_meta, [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] self._vmops.spawn(context, instance, image_meta, injected_files, [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] self._fetch_image_if_missing(context, vi) [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] image_cache(vi, tmp_image_ds_loc) [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] vm_util.copy_virtual_disk( [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] session._wait_for_task(vmdk_copy_task) [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 994.760073] env[60175]: ERROR 
nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] return self.wait_for_task(task_ref) [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] return evt.wait() [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] result = hub.switch() [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] return self.greenlet.switch() [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] self.f(*self.args, **self.kw) [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] raise exceptions.translate_fault(task_info.error) [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Faults: ['InvalidArgument'] [ 994.760073] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] [ 994.760855] env[60175]: INFO nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Terminating instance [ 994.762008] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 994.762219] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 994.762471] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3de34d6c-f4b5-4d23-9365-0ebf292165ba {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.764645] env[60175]: DEBUG nova.compute.manager [None 
req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 994.764845] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 994.765561] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfbda5d8-b716-4043-8393-0578d9cab6f3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.772523] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 994.772746] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e3e7a33b-3abe-4881-9721-3ab168175a98 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.774952] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 994.775133] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 994.776076] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5ae9fe65-6391-4b32-96ee-325da85faef3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.780537] env[60175]: DEBUG oslo_vmware.api [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Waiting for the task: (returnval){ [ 994.780537] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52574821-a4b7-f64d-64c2-76ff93c9ee50" [ 994.780537] env[60175]: _type = "Task" [ 994.780537] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 994.788014] env[60175]: DEBUG oslo_vmware.api [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52574821-a4b7-f64d-64c2-76ff93c9ee50, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 994.845677] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 994.845921] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 994.846185] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Deleting the datastore file [datastore2] da3eaeea-ce26-40eb-af8b-8857f927e431 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 994.846460] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a9c27538-6e06-47bd-8766-b66e69f67305 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.853127] env[60175]: DEBUG oslo_vmware.api [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Waiting for the task: (returnval){ [ 994.853127] env[60175]: value = "task-4292952" [ 994.853127] env[60175]: _type = "Task" [ 994.853127] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 994.860531] env[60175]: DEBUG oslo_vmware.api [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Task: {'id': task-4292952, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 995.291235] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 995.291570] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Creating directory with path [datastore2] vmware_temp/3439777d-a659-4048-a017-0d39b89068eb/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 995.291736] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-04bf1ae3-cfed-4f42-915e-b14b24abd65c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.303231] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Created directory with path [datastore2] vmware_temp/3439777d-a659-4048-a017-0d39b89068eb/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 995.303446] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Fetch image to [datastore2] vmware_temp/3439777d-a659-4048-a017-0d39b89068eb/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 995.303618] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/3439777d-a659-4048-a017-0d39b89068eb/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 995.304319] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb529ff1-e55d-454b-aeec-971cd777e266 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.310673] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5cb26e2-afdf-4172-aa43-b880dee3018a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.319279] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd626413-e0e4-4f48-b0d8-f127d7044946 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.351240] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac66f6b6-19d7-40cf-a4fb-b98e0709e8ad {{(pid=60175) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.362316] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1d1d234b-f306-48c5-bf70-d776a7b1938c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.363988] env[60175]: DEBUG oslo_vmware.api [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Task: {'id': task-4292952, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066777} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 995.364228] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 995.364402] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 995.364567] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 995.364733] env[60175]: INFO nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 995.366878] env[60175]: DEBUG nova.compute.claims [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 995.367132] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 995.367366] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 995.385798] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 995.434990] env[60175]: DEBUG oslo_vmware.rw_handles [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3439777d-a659-4048-a017-0d39b89068eb/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 995.493511] env[60175]: DEBUG oslo_vmware.rw_handles [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 995.493817] env[60175]: DEBUG oslo_vmware.rw_handles [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3439777d-a659-4048-a017-0d39b89068eb/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 995.546439] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84d286e8-5aac-423e-b6f1-bab0224ade41 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.553992] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2142f3d-0427-4b8c-aeb7-26c0d7559bbb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.582827] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c33b1146-0212-4ca8-a549-99a613b8e24f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.591307] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab08af78-aa92-4c30-872c-acc884553a77 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.604947] env[60175]: DEBUG nova.compute.provider_tree [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 995.614783] env[60175]: DEBUG nova.scheduler.client.report [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 995.629711] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.262s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 995.630273] env[60175]: ERROR nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 995.630273] env[60175]: Faults: ['InvalidArgument'] [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Traceback (most recent call last): [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] self.driver.spawn(context, instance, image_meta, [ 995.630273] 
env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] self._vmops.spawn(context, instance, image_meta, injected_files, [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] self._fetch_image_if_missing(context, vi) [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] image_cache(vi, tmp_image_ds_loc) [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] vm_util.copy_virtual_disk( [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] session._wait_for_task(vmdk_copy_task) [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] return self.wait_for_task(task_ref) [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] return evt.wait() [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] result = hub.switch() [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] return self.greenlet.switch() [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] self.f(*self.args, **self.kw) [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] raise 
exceptions.translate_fault(task_info.error) [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Faults: ['InvalidArgument'] [ 995.630273] env[60175]: ERROR nova.compute.manager [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] [ 995.631164] env[60175]: DEBUG nova.compute.utils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 995.632414] env[60175]: DEBUG nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Build of instance da3eaeea-ce26-40eb-af8b-8857f927e431 was re-scheduled: A specified parameter was not correct: fileType [ 995.632414] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 995.632786] env[60175]: DEBUG nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 995.632953] env[60175]: DEBUG nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 995.633134] env[60175]: DEBUG nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 995.633459] env[60175]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 995.933650] env[60175]: DEBUG nova.network.neutron [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 995.945591] env[60175]: INFO nova.compute.manager [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Took 0.31 seconds to deallocate network for instance. 
[ 996.062074] env[60175]: INFO nova.scheduler.client.report [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Deleted allocations for instance da3eaeea-ce26-40eb-af8b-8857f927e431 [ 996.079209] env[60175]: DEBUG oslo_concurrency.lockutils [None req-56a37a98-f8ba-4658-8b0c-d83f9724711b tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "da3eaeea-ce26-40eb-af8b-8857f927e431" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 433.569s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.080650] env[60175]: DEBUG oslo_concurrency.lockutils [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "da3eaeea-ce26-40eb-af8b-8857f927e431" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 235.716s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 996.080869] env[60175]: DEBUG oslo_concurrency.lockutils [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Acquiring lock "da3eaeea-ce26-40eb-af8b-8857f927e431-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 996.081240] env[60175]: DEBUG oslo_concurrency.lockutils [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "da3eaeea-ce26-40eb-af8b-8857f927e431-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 996.081473] env[60175]: DEBUG oslo_concurrency.lockutils [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "da3eaeea-ce26-40eb-af8b-8857f927e431-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.083267] env[60175]: INFO nova.compute.manager [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Terminating instance [ 996.084973] env[60175]: DEBUG nova.compute.manager [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Start destroying the instance on the hypervisor. 
{{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 996.085264] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 996.085732] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e9a517cd-840f-46d9-bdcb-01105fbb8e44 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.094169] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 996.099581] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26c27777-1b22-47d0-bfc1-975f563e21d9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.127284] env[60175]: WARNING nova.virt.vmwareapi.vmops [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance da3eaeea-ce26-40eb-af8b-8857f927e431 could not be found. [ 996.127479] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 996.127678] env[60175]: INFO nova.compute.manager [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Took 0.04 seconds to destroy the instance on the hypervisor. [ 996.127884] env[60175]: DEBUG oslo.service.loopingcall [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 996.128098] env[60175]: DEBUG nova.compute.manager [-] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 996.128196] env[60175]: DEBUG nova.network.neutron [-] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 996.144332] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 996.144566] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 996.145992] env[60175]: INFO nova.compute.claims [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 996.157726] env[60175]: DEBUG nova.network.neutron [-] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 996.166758] env[60175]: INFO nova.compute.manager [-] [instance: da3eaeea-ce26-40eb-af8b-8857f927e431] Took 0.04 seconds to deallocate network for instance. 
[ 996.256048] env[60175]: DEBUG oslo_concurrency.lockutils [None req-3ce60da9-8a2d-4616-b714-e3963aad11b9 tempest-ServersTestJSON-510573566 tempest-ServersTestJSON-510573566-project-member] Lock "da3eaeea-ce26-40eb-af8b-8857f927e431" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.175s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.274399] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19fabbea-00d0-40fe-aa08-02743e07d5c8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.281876] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c30d0de-2dc8-4a26-8515-5305d3f08f0b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.310857] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7041db09-eeb0-485e-962e-006a4080c542 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.317506] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bb3ac03-2d40-4a2e-8d12-772374553560 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.329888] env[60175]: DEBUG nova.compute.provider_tree [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 996.337423] env[60175]: DEBUG nova.scheduler.client.report [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 996.350865] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.205s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.350865] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Start building networks asynchronously for instance. 
{{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 996.380701] env[60175]: DEBUG nova.compute.utils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 996.382621] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 996.382800] env[60175]: DEBUG nova.network.neutron [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 996.390796] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 996.436846] env[60175]: DEBUG nova.policy [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3ad0ae5480f1418283d649af7e0e3810', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0e5e8feb5e194fddb47dd8364495bb2b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 996.451835] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Start spawning the instance on the hypervisor. 
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 996.471271] env[60175]: DEBUG nova.virt.hardware [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 996.471526] env[60175]: DEBUG nova.virt.hardware [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 996.471682] env[60175]: DEBUG nova.virt.hardware [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 996.471858] env[60175]: DEBUG nova.virt.hardware [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 996.471997] env[60175]: DEBUG nova.virt.hardware [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 996.472152] env[60175]: DEBUG nova.virt.hardware [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 996.472543] env[60175]: DEBUG nova.virt.hardware [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 996.472702] env[60175]: DEBUG nova.virt.hardware [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 996.472981] env[60175]: DEBUG nova.virt.hardware [None 
req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 996.473064] env[60175]: DEBUG nova.virt.hardware [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 996.473204] env[60175]: DEBUG nova.virt.hardware [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 996.474133] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3273989-b195-4a5b-a9e3-08017489aa19 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.481753] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a30a5815-e7cc-43eb-9f13-937146329e30 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.773456] env[60175]: DEBUG nova.network.neutron [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Successfully created port: 54e5c076-e188-4834-a63f-99afcfff6137 {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 997.364904] env[60175]: DEBUG nova.network.neutron [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Successfully updated port: 54e5c076-e188-4834-a63f-99afcfff6137 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 997.376649] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "refresh_cache-a45c150e-942b-454a-ab59-aa6b191bfada" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 997.376793] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquired lock "refresh_cache-a45c150e-942b-454a-ab59-aa6b191bfada" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 997.376938] env[60175]: DEBUG nova.network.neutron [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 997.416196] env[60175]: DEBUG nova.network.neutron [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 
tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 997.576766] env[60175]: DEBUG nova.network.neutron [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Updating instance_info_cache with network_info: [{"id": "54e5c076-e188-4834-a63f-99afcfff6137", "address": "fa:16:3e:92:12:08", "network": {"id": "ded1035f-0d77-432e-9b04-e59d293f8f45", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1709423783-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e5e8feb5e194fddb47dd8364495bb2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4c8a5d7c-ee1f-4a41-94e4-db31e85a398d", "external-id": "cl2-zone-613", "segmentation_id": 613, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54e5c076-e1", "ovs_interfaceid": "54e5c076-e188-4834-a63f-99afcfff6137", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 997.586891] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Releasing lock "refresh_cache-a45c150e-942b-454a-ab59-aa6b191bfada" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 997.587174] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Instance network_info: |[{"id": "54e5c076-e188-4834-a63f-99afcfff6137", "address": "fa:16:3e:92:12:08", "network": {"id": "ded1035f-0d77-432e-9b04-e59d293f8f45", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1709423783-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e5e8feb5e194fddb47dd8364495bb2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4c8a5d7c-ee1f-4a41-94e4-db31e85a398d", "external-id": "cl2-zone-613", "segmentation_id": 613, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54e5c076-e1", "ovs_interfaceid": "54e5c076-e188-4834-a63f-99afcfff6137", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}]| {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 997.587530] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:92:12:08', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4c8a5d7c-ee1f-4a41-94e4-db31e85a398d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '54e5c076-e188-4834-a63f-99afcfff6137', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 997.594829] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Creating folder: Project (0e5e8feb5e194fddb47dd8364495bb2b). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 997.595338] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b58023db-a970-4f96-a1cb-75eb8c844407 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.606843] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Created folder: Project (0e5e8feb5e194fddb47dd8364495bb2b) in parent group-v845475. [ 997.607027] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Creating folder: Instances. Parent ref: group-v845540. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 997.607232] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b17ef2f0-5f0a-40b7-9295-050e8145f678 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.615849] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Created folder: Instances in parent group-v845540. [ 997.616072] env[60175]: DEBUG oslo.service.loopingcall [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 997.616237] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 997.616407] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c99fb2eb-fceb-4c1b-b6ff-899211227250 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.634580] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 997.634580] env[60175]: value = "task-4292955" [ 997.634580] env[60175]: _type = "Task" [ 997.634580] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 997.641551] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292955, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 997.966065] env[60175]: DEBUG nova.compute.manager [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Received event network-vif-plugged-54e5c076-e188-4834-a63f-99afcfff6137 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 997.966341] env[60175]: DEBUG oslo_concurrency.lockutils [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] Acquiring lock "a45c150e-942b-454a-ab59-aa6b191bfada-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 997.966551] env[60175]: DEBUG oslo_concurrency.lockutils [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] Lock "a45c150e-942b-454a-ab59-aa6b191bfada-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 997.966717] env[60175]: DEBUG oslo_concurrency.lockutils [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] Lock "a45c150e-942b-454a-ab59-aa6b191bfada-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 997.966880] env[60175]: DEBUG nova.compute.manager [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] No waiting events found dispatching network-vif-plugged-54e5c076-e188-4834-a63f-99afcfff6137 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 997.967053] env[60175]: WARNING nova.compute.manager [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Received unexpected event network-vif-plugged-54e5c076-e188-4834-a63f-99afcfff6137 for instance with vm_state building and task_state spawning. 
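The WARNING that closes the entry above ("Received unexpected event network-vif-plugged-54e5c076-... for instance with vm_state building and task_state spawning") typically means the Neutron event arrived before anything had registered a waiter for it: the manager takes the per-instance "<uuid>-events" lock, looks for a registered waiter, finds none, and logs the warning instead of dispatching. A minimal sketch of that pop-or-warn pattern, using illustrative names rather than nova's actual implementation:

```python
# Minimal sketch (not nova's real code) of the per-instance event bookkeeping
# behind the "Acquiring lock '<uuid>-events'" and "Received unexpected event"
# lines above. All names here are illustrative.
import threading
import logging

LOG = logging.getLogger(__name__)

class InstanceEventTable:
    def __init__(self):
        self._lock = threading.Lock()   # stands in for the "<uuid>-events" lock
        self._waiters = {}              # {(instance_uuid, event_name): Event}

    def prepare(self, instance_uuid, event_name):
        """Register a waiter before an operation that expects an external event."""
        waiter = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = waiter
        return waiter

    def pop_and_dispatch(self, instance_uuid, event_name):
        """Handle an external event such as network-vif-plugged-<port-id>."""
        with self._lock:                # "acquired ... waited 0.000s"
            waiter = self._waiters.pop((instance_uuid, event_name), None)
        if waiter is None:
            # Matches the WARNING in the log: the event arrived while the
            # instance was still building/spawning and nobody was waiting yet.
            LOG.warning("Received unexpected event %s for instance %s",
                        event_name, instance_uuid)
            return
        waiter.set()                    # wake whatever is blocked on the event

# events = InstanceEventTable()
# events.pop_and_dispatch("a45c150e-942b-454a-ab59-aa6b191bfada",
#                         "network-vif-plugged-54e5c076-e188-4834-a63f-99afcfff6137")
```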
[ 997.967216] env[60175]: DEBUG nova.compute.manager [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Received event network-changed-54e5c076-e188-4834-a63f-99afcfff6137 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 997.967368] env[60175]: DEBUG nova.compute.manager [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Refreshing instance network info cache due to event network-changed-54e5c076-e188-4834-a63f-99afcfff6137. {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 997.967543] env[60175]: DEBUG oslo_concurrency.lockutils [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] Acquiring lock "refresh_cache-a45c150e-942b-454a-ab59-aa6b191bfada" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 997.967674] env[60175]: DEBUG oslo_concurrency.lockutils [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] Acquired lock "refresh_cache-a45c150e-942b-454a-ab59-aa6b191bfada" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 997.967864] env[60175]: DEBUG nova.network.neutron [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Refreshing network info cache for port 54e5c076-e188-4834-a63f-99afcfff6137 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 998.144660] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292955, 'name': CreateVM_Task, 'duration_secs': 0.267146} completed successfully. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 998.144782] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 998.145434] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 998.145597] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 998.145906] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 998.146196] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fafc1906-cae0-4b5e-afde-c15b7f9e3f80 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.152361] env[60175]: DEBUG oslo_vmware.api [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Waiting for the task: (returnval){ [ 998.152361] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5252d128-0d0f-c78d-c5cd-636d565d14f8" [ 998.152361] env[60175]: _type = "Task" [ 998.152361] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 998.159553] env[60175]: DEBUG oslo_vmware.api [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5252d128-0d0f-c78d-c5cd-636d565d14f8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 998.191122] env[60175]: DEBUG nova.network.neutron [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Updated VIF entry in instance network info cache for port 54e5c076-e188-4834-a63f-99afcfff6137. 
{{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 998.191463] env[60175]: DEBUG nova.network.neutron [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Updating instance_info_cache with network_info: [{"id": "54e5c076-e188-4834-a63f-99afcfff6137", "address": "fa:16:3e:92:12:08", "network": {"id": "ded1035f-0d77-432e-9b04-e59d293f8f45", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1709423783-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e5e8feb5e194fddb47dd8364495bb2b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4c8a5d7c-ee1f-4a41-94e4-db31e85a398d", "external-id": "cl2-zone-613", "segmentation_id": 613, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap54e5c076-e1", "ovs_interfaceid": "54e5c076-e188-4834-a63f-99afcfff6137", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 998.200332] env[60175]: DEBUG oslo_concurrency.lockutils [req-4db16eaf-7694-44f4-8eb1-41427717fdd9 req-86d7b827-1975-4f23-8202-b79c46905164 service nova] Releasing lock "refresh_cache-a45c150e-942b-454a-ab59-aa6b191bfada" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 998.662665] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 998.662999] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 998.663081] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1005.672429] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "67cfe7ba-4590-451b-9e1a-340977b597a4" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60175) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1041.215885] env[60175]: WARNING oslo_vmware.rw_handles [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1041.215885] env[60175]: ERROR oslo_vmware.rw_handles [ 1041.216522] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/3439777d-a659-4048-a017-0d39b89068eb/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1041.218578] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1041.218833] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Copying Virtual Disk [datastore2] vmware_temp/3439777d-a659-4048-a017-0d39b89068eb/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/3439777d-a659-4048-a017-0d39b89068eb/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1041.219160] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-77fe8026-c64a-4bc6-a46c-062bdad6b246 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.228909] env[60175]: DEBUG oslo_vmware.api [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Waiting 
for the task: (returnval){ [ 1041.228909] env[60175]: value = "task-4292956" [ 1041.228909] env[60175]: _type = "Task" [ 1041.228909] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1041.237476] env[60175]: DEBUG oslo_vmware.api [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Task: {'id': task-4292956, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1041.739555] env[60175]: DEBUG oslo_vmware.exceptions [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Fault InvalidArgument not matched. {{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1041.739806] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1041.740405] env[60175]: ERROR nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1041.740405] env[60175]: Faults: ['InvalidArgument'] [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Traceback (most recent call last): [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] yield resources [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] self.driver.spawn(context, instance, image_meta, [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] self._fetch_image_if_missing(context, vi) [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 
81af879b-3bc3-4aff-a99d-98d3aba73512] image_cache(vi, tmp_image_ds_loc) [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] vm_util.copy_virtual_disk( [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] session._wait_for_task(vmdk_copy_task) [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] return self.wait_for_task(task_ref) [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] return evt.wait() [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] result = hub.switch() [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] return self.greenlet.switch() [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] self.f(*self.args, **self.kw) [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] raise exceptions.translate_fault(task_info.error) [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Faults: ['InvalidArgument'] [ 1041.740405] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] [ 1041.741433] env[60175]: INFO nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Terminating instance [ 1041.742337] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 
tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1041.742539] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1041.742804] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b626ff0c-61c1-4533-9261-fe6399333a6f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.745265] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1041.745451] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1041.746171] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19697a0e-9158-44b4-9c64-ed80914c1681 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.752876] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1041.753135] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7ccaf1bf-4dbe-45a2-8785-066a7d179bb4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.755268] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1041.755434] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1041.756406] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-65853424-df7f-447d-831a-503ab1f14dad {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.761228] env[60175]: DEBUG oslo_vmware.api [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Waiting for the task: (returnval){ [ 1041.761228] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]520b2779-c6c1-5d5a-b034-2680c32d7e43" [ 1041.761228] env[60175]: _type = "Task" [ 1041.761228] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1041.768691] env[60175]: DEBUG oslo_vmware.api [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]520b2779-c6c1-5d5a-b034-2680c32d7e43, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1041.896312] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1041.896532] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1041.896707] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Deleting the datastore file [datastore2] 81af879b-3bc3-4aff-a99d-98d3aba73512 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1041.896969] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d1427066-e26f-496e-93ff-1c63264291ed {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.903109] env[60175]: DEBUG oslo_vmware.api [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Waiting for the task: (returnval){ [ 1041.903109] env[60175]: value = "task-4292958" [ 1041.903109] env[60175]: _type = "Task" [ 1041.903109] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1041.910559] env[60175]: DEBUG oslo_vmware.api [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Task: {'id': task-4292958, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1042.271430] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1042.271815] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Creating directory with path [datastore2] vmware_temp/e94276c8-8b55-4f82-a3f8-7529a6d74fa2/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1042.271924] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f60d3246-47da-4f03-9a0b-dd48513fd918 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.282902] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Created directory with path [datastore2] vmware_temp/e94276c8-8b55-4f82-a3f8-7529a6d74fa2/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1042.283093] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Fetch image to [datastore2] vmware_temp/e94276c8-8b55-4f82-a3f8-7529a6d74fa2/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1042.283263] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/e94276c8-8b55-4f82-a3f8-7529a6d74fa2/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1042.283994] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ef78416-106e-41d3-9e67-4b80bb8e0885 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.290616] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a80cec1-1845-450f-b7e6-ac7db61895bb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.299167] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3ba5bb1-1f6c-4a56-8914-b24d47a2292c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.330516] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5b4fc446-c55e-41ed-b9be-c8fbc329f2b7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.336425] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aa6c2bb6-cee5-4d1b-85d9-916601926c13 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.361276] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1042.407858] env[60175]: DEBUG oslo_vmware.rw_handles [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e94276c8-8b55-4f82-a3f8-7529a6d74fa2/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1042.463599] env[60175]: DEBUG oslo_vmware.api [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Task: {'id': task-4292958, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069778} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1042.464929] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1042.465137] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1042.465310] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1042.465482] env[60175]: INFO nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Took 0.72 seconds to destroy the instance on the hypervisor. [ 1042.467320] env[60175]: DEBUG oslo_vmware.rw_handles [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Completed reading data from the image iterator. 
{{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1042.467480] env[60175]: DEBUG oslo_vmware.rw_handles [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e94276c8-8b55-4f82-a3f8-7529a6d74fa2/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1042.468041] env[60175]: DEBUG nova.compute.claims [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1042.468209] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1042.468418] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1042.586602] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-995599f8-d905-464e-ae0e-518b2f7cf0b6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.593734] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95437ea3-7bf3-4485-b47e-c95c091bc955 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.622470] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ea0748f-26e6-4a8d-bc74-0967d40e3624 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.629106] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3a7b5f9-44f7-4b01-8140-056aaae5f0bc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.642776] env[60175]: DEBUG nova.compute.provider_tree [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1042.650914] env[60175]: DEBUG nova.scheduler.client.report [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e 
based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1042.663649] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.195s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1042.664190] env[60175]: ERROR nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1042.664190] env[60175]: Faults: ['InvalidArgument'] [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Traceback (most recent call last): [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] self.driver.spawn(context, instance, image_meta, [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] self._fetch_image_if_missing(context, vi) [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] image_cache(vi, tmp_image_ds_loc) [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] vm_util.copy_virtual_disk( [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] session._wait_for_task(vmdk_copy_task) [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 
81af879b-3bc3-4aff-a99d-98d3aba73512] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] return self.wait_for_task(task_ref) [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] return evt.wait() [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] result = hub.switch() [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] return self.greenlet.switch() [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] self.f(*self.args, **self.kw) [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] raise exceptions.translate_fault(task_info.error) [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Faults: ['InvalidArgument'] [ 1042.664190] env[60175]: ERROR nova.compute.manager [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] [ 1042.665067] env[60175]: DEBUG nova.compute.utils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1042.666227] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Build of instance 81af879b-3bc3-4aff-a99d-98d3aba73512 was re-scheduled: A specified parameter was not correct: fileType [ 1042.666227] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1042.666596] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks 
/opt/stack/nova/nova/compute/manager.py:2976}} [ 1042.666764] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1042.666930] env[60175]: DEBUG nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1042.667099] env[60175]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1042.932388] env[60175]: DEBUG nova.network.neutron [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1042.941913] env[60175]: INFO nova.compute.manager [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Took 0.27 seconds to deallocate network for instance. 
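The "Inventory has not changed for provider ..." report above carries the full inventory record for this compute node. Placement sizes each resource class as (total - reserved) * allocation_ratio, and that is the pool the aborted claim a few entries earlier is returned to. A small worked example using the values from the log; the helper function is illustrative:

```python
# Worked example with the inventory data logged above. The capacity rule
# (total - reserved) * allocation_ratio is how placement derives schedulable
# capacity per resource class; values are copied from the log entry.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 268,    'reserved': 0,   'allocation_ratio': 1.0},
}

def usable_capacity(inv):
    return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
            for rc, v in inv.items()}

print(usable_capacity(inventory))
# {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 268}
```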
[ 1043.025369] env[60175]: INFO nova.scheduler.client.report [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Deleted allocations for instance 81af879b-3bc3-4aff-a99d-98d3aba73512 [ 1043.042977] env[60175]: DEBUG oslo_concurrency.lockutils [None req-4709a801-3325-4dca-afa6-685ba47107e6 tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "81af879b-3bc3-4aff-a99d-98d3aba73512" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 470.886s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.044053] env[60175]: DEBUG oslo_concurrency.lockutils [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "81af879b-3bc3-4aff-a99d-98d3aba73512" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 273.128s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1043.044479] env[60175]: DEBUG oslo_concurrency.lockutils [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Acquiring lock "81af879b-3bc3-4aff-a99d-98d3aba73512-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1043.044479] env[60175]: DEBUG oslo_concurrency.lockutils [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "81af879b-3bc3-4aff-a99d-98d3aba73512-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1043.044635] env[60175]: DEBUG oslo_concurrency.lockutils [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "81af879b-3bc3-4aff-a99d-98d3aba73512-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.046597] env[60175]: INFO nova.compute.manager [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Terminating instance [ 1043.048152] env[60175]: DEBUG nova.compute.manager [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Start destroying the instance on the hypervisor. 
{{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1043.048342] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1043.048787] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5e19d44a-d3e9-46d6-a8f4-7861ccd27c2e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.057921] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d40baf80-be8b-4455-a0d7-93ae5ab27419 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.068764] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Starting instance... {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1043.088178] env[60175]: WARNING nova.virt.vmwareapi.vmops [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 81af879b-3bc3-4aff-a99d-98d3aba73512 could not be found. [ 1043.088450] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1043.088689] env[60175]: INFO nova.compute.manager [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1043.088941] env[60175]: DEBUG oslo.service.loopingcall [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1043.089161] env[60175]: DEBUG nova.compute.manager [-] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1043.089272] env[60175]: DEBUG nova.network.neutron [-] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1043.113912] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1043.114163] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1043.115559] env[60175]: INFO nova.compute.claims [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1043.117900] env[60175]: DEBUG nova.network.neutron [-] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1043.125556] env[60175]: INFO nova.compute.manager [-] [instance: 81af879b-3bc3-4aff-a99d-98d3aba73512] Took 0.04 seconds to deallocate network for instance. 
[ 1043.222978] env[60175]: DEBUG oslo_concurrency.lockutils [None req-e2810962-e3e0-45eb-9375-48d8d742f0bb tempest-VolumesAssistedSnapshotsTest-48868217 tempest-VolumesAssistedSnapshotsTest-48868217-project-member] Lock "81af879b-3bc3-4aff-a99d-98d3aba73512" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.179s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.238170] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-610bd8d3-34e0-4429-a910-75015ebdb24c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.245409] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe7a6ae6-c004-4ffb-83fe-96986c6466bc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.275976] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c5a9f8b-43d1-42dd-bbd0-22bd7990be9e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.282573] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36467b37-8e93-4431-a8a5-6c610908af46 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.295130] env[60175]: DEBUG nova.compute.provider_tree [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1043.303862] env[60175]: DEBUG nova.scheduler.client.report [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1043.315693] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.316141] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Start building networks asynchronously for instance. 
{{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1043.347722] env[60175]: DEBUG nova.compute.utils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1043.349112] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1043.349281] env[60175]: DEBUG nova.network.neutron [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1043.357131] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1043.415295] env[60175]: DEBUG nova.policy [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4930bb1ad0cc4376a388847b3238dded', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ed721a0a42ee43fba6f37868594bffec', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 1043.417950] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Start spawning the instance on the hypervisor. 
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1043.439696] env[60175]: DEBUG nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:54:28Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='90f538cf-6d60-4d52-9dd3-646d48ecc3f2',id=37,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-999581241',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1043.439696] env[60175]: DEBUG nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1043.439696] env[60175]: DEBUG nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1043.439696] env[60175]: DEBUG nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1043.439696] env[60175]: DEBUG nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1043.439696] env[60175]: DEBUG nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1043.440010] env[60175]: DEBUG nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1043.440051] env[60175]: DEBUG nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1043.441055] env[60175]: DEBUG 
nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1043.441055] env[60175]: DEBUG nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1043.441055] env[60175]: DEBUG nova.virt.hardware [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1043.441966] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7164076a-6b4c-4e94-a3b4-82fbd1e98a27 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.448987] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89844f41-c62c-4208-bc7e-61bdf3c40312 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.808868] env[60175]: DEBUG nova.network.neutron [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Successfully created port: f87176c0-1159-456d-9ea8-4227763025cd {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1044.394166] env[60175]: DEBUG nova.network.neutron [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Successfully updated port: f87176c0-1159-456d-9ea8-4227763025cd {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1044.403434] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "refresh_cache-8f4635d8-5789-4402-8ca2-543b4d4dfc76" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1044.403584] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquired lock "refresh_cache-8f4635d8-5789-4402-8ca2-543b4d4dfc76" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1044.403766] env[60175]: DEBUG nova.network.neutron [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1044.439247] env[60175]: DEBUG nova.network.neutron [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 
tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1044.587502] env[60175]: DEBUG nova.network.neutron [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Updating instance_info_cache with network_info: [{"id": "f87176c0-1159-456d-9ea8-4227763025cd", "address": "fa:16:3e:a1:7b:90", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.135", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf87176c0-11", "ovs_interfaceid": "f87176c0-1159-456d-9ea8-4227763025cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1044.597871] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Releasing lock "refresh_cache-8f4635d8-5789-4402-8ca2-543b4d4dfc76" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1044.598200] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Instance network_info: |[{"id": "f87176c0-1159-456d-9ea8-4227763025cd", "address": "fa:16:3e:a1:7b:90", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.135", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf87176c0-11", "ovs_interfaceid": "f87176c0-1159-456d-9ea8-4227763025cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60175) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1044.598576] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a1:7b:90', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '537e0890-4fa2-4f2d-b74c-49933a4edf53', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f87176c0-1159-456d-9ea8-4227763025cd', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1044.606130] env[60175]: DEBUG oslo.service.loopingcall [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1044.606683] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1044.606999] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-49205e98-d1ab-4df5-99e8-fa30a94cacf8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1044.626766] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1044.626766] env[60175]: value = "task-4292959" [ 1044.626766] env[60175]: _type = "Task" [ 1044.626766] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1044.634411] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292959, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1044.963860] env[60175]: DEBUG nova.compute.manager [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Received event network-vif-plugged-f87176c0-1159-456d-9ea8-4227763025cd {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1044.964112] env[60175]: DEBUG oslo_concurrency.lockutils [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] Acquiring lock "8f4635d8-5789-4402-8ca2-543b4d4dfc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1044.964299] env[60175]: DEBUG oslo_concurrency.lockutils [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] Lock "8f4635d8-5789-4402-8ca2-543b4d4dfc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1044.964460] env[60175]: DEBUG oslo_concurrency.lockutils [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] Lock "8f4635d8-5789-4402-8ca2-543b4d4dfc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1044.964621] env[60175]: DEBUG nova.compute.manager [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] No waiting events found dispatching network-vif-plugged-f87176c0-1159-456d-9ea8-4227763025cd {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1044.964778] env[60175]: WARNING nova.compute.manager [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Received unexpected event network-vif-plugged-f87176c0-1159-456d-9ea8-4227763025cd for instance with vm_state building and task_state spawning. [ 1044.964928] env[60175]: DEBUG nova.compute.manager [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Received event network-changed-f87176c0-1159-456d-9ea8-4227763025cd {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1044.965132] env[60175]: DEBUG nova.compute.manager [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Refreshing instance network info cache due to event network-changed-f87176c0-1159-456d-9ea8-4227763025cd. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1044.965324] env[60175]: DEBUG oslo_concurrency.lockutils [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] Acquiring lock "refresh_cache-8f4635d8-5789-4402-8ca2-543b4d4dfc76" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1044.965453] env[60175]: DEBUG oslo_concurrency.lockutils [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] Acquired lock "refresh_cache-8f4635d8-5789-4402-8ca2-543b4d4dfc76" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1044.965600] env[60175]: DEBUG nova.network.neutron [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Refreshing network info cache for port f87176c0-1159-456d-9ea8-4227763025cd {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1045.136322] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292959, 'name': CreateVM_Task, 'duration_secs': 0.273878} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1045.136488] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1045.137200] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1045.137358] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1045.137661] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1045.137891] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6a648663-488f-43d4-ad64-69bd7228784a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.142107] env[60175]: DEBUG oslo_vmware.api [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Waiting for the task: (returnval){ [ 1045.142107] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]520ce139-d1f1-d4dd-3be4-ab2f6930f007" [ 1045.142107] env[60175]: _type = "Task" [ 1045.142107] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1045.151547] env[60175]: DEBUG oslo_vmware.api [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]520ce139-d1f1-d4dd-3be4-ab2f6930f007, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1045.190861] env[60175]: DEBUG nova.network.neutron [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Updated VIF entry in instance network info cache for port f87176c0-1159-456d-9ea8-4227763025cd. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1045.191234] env[60175]: DEBUG nova.network.neutron [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Updating instance_info_cache with network_info: [{"id": "f87176c0-1159-456d-9ea8-4227763025cd", "address": "fa:16:3e:a1:7b:90", "network": {"id": "82dbb7c5-5415-40d9-a827-1ad8c21abaad", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.135", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "615f5638ac394d9090feb5ebdacc55aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "537e0890-4fa2-4f2d-b74c-49933a4edf53", "external-id": "nsx-vlan-transportzone-82", "segmentation_id": 82, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf87176c0-11", "ovs_interfaceid": "f87176c0-1159-456d-9ea8-4227763025cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1045.200314] env[60175]: DEBUG oslo_concurrency.lockutils [req-b51b5599-2f05-4373-93ba-ec64ca0708b5 req-848b6239-8ac8-4090-ba73-751eda4d438c service nova] Releasing lock "refresh_cache-8f4635d8-5789-4402-8ca2-543b4d4dfc76" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1045.654560] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1045.654909] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1045.654974] env[60175]: DEBUG oslo_concurrency.lockutils [None 
req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1047.949472] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1047.959177] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1047.959435] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1047.959653] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1047.959843] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1047.960928] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa3a0bb9-7d32-4df5-878d-ff0a1023561d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.969670] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b153075f-f99b-4f0c-bf99-b0058aeabce9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.983471] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7229504b-09e7-405d-a97a-72394aedea9d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.989912] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d68eed5-ac33-4020-a1a4-8ebd7188d466 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.037119] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180726MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1048.037344] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1048.037619] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.086045] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 843d4db6-c1fb-4b74-ad3c-779e309a170e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.086218] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 67cfe7ba-4590-451b-9e1a-340977b597a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.086345] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance e2c5328d-ba5a-4348-8a3f-2a9f745e8f08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.086462] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance a45c150e-942b-454a-ab59-aa6b191bfada actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.086577] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 8f4635d8-5789-4402-8ca2-543b4d4dfc76 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.086751] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1048.086888] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=149GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1048.154563] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6541570b-8f24-41b0-acbb-678b04d5b20d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.161923] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae9b2b4b-290d-4957-aaa4-d6f75913ce50 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.190712] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c643c71-af9d-420f-9bcd-08722b26bbc3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.197466] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9db1983-8fd1-4fb3-9c41-4e298bb71f47 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.211194] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1048.218919] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1048.231929] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1048.232119] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1050.233074] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] 
Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1050.233074] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1050.233074] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1050.247442] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1050.247584] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1050.247706] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1050.247836] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1050.248016] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1050.248191] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1050.248594] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1050.949865] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1050.950137] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1052.949754] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1052.950035] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1052.950157] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1053.951351] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1054.950653] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1069.653356] env[60175]: DEBUG nova.compute.manager [req-2e2467cc-e098-4278-9de4-50721b01ca81 req-9a758f04-ff84-4f1e-9181-b1b6a8b07a7d service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Received event network-vif-deleted-dd85cf77-b1fd-4be9-8b53-49cd7b671dfd {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1069.653656] env[60175]: INFO nova.compute.manager [req-2e2467cc-e098-4278-9de4-50721b01ca81 req-9a758f04-ff84-4f1e-9181-b1b6a8b07a7d service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Neutron deleted interface dd85cf77-b1fd-4be9-8b53-49cd7b671dfd; detaching it from the instance and deleting it from the info cache [ 1069.655038] env[60175]: DEBUG nova.network.neutron [req-2e2467cc-e098-4278-9de4-50721b01ca81 req-9a758f04-ff84-4f1e-9181-b1b6a8b07a7d service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Updating instance_info_cache with network_info: [{"id": "bb80456f-edd5-48f8-90d1-2263c6b3e6fa", "address": "fa:16:3e:8d:2e:06", "network": {"id": "35d93d8e-c794-4369-818a-56c90c185516", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1854994814", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], 
"gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.137", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "027d599d310b4abf9ce371b09bb3253b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "557aba95-8968-407a-bac2-2fae66f7c8e5", "external-id": "nsx-vlan-transportzone-45", "segmentation_id": 45, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbb80456f-ed", "ovs_interfaceid": "bb80456f-edd5-48f8-90d1-2263c6b3e6fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1069.665471] env[60175]: DEBUG oslo_concurrency.lockutils [req-2e2467cc-e098-4278-9de4-50721b01ca81 req-9a758f04-ff84-4f1e-9181-b1b6a8b07a7d service nova] Acquiring lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1071.687300] env[60175]: DEBUG nova.compute.manager [req-cfdbe7df-fc05-440b-8f22-bd56cb343298 req-11b1546c-25ed-4b98-b0c3-00858c1904c6 service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Received event network-vif-deleted-bb80456f-edd5-48f8-90d1-2263c6b3e6fa {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1075.447917] env[60175]: DEBUG nova.compute.manager [req-9936f2bd-9e82-4dc8-aa02-d3dd5897c2b2 req-b24ce444-86aa-4fab-b8c3-24a5407dda64 service nova] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Received event network-vif-deleted-54e5c076-e188-4834-a63f-99afcfff6137 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1091.232218] env[60175]: WARNING oslo_vmware.rw_handles [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1091.232218] env[60175]: ERROR oslo_vmware.rw_handles [ 
1091.232893] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/e94276c8-8b55-4f82-a3f8-7529a6d74fa2/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1091.234662] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1091.234895] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Copying Virtual Disk [datastore2] vmware_temp/e94276c8-8b55-4f82-a3f8-7529a6d74fa2/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/e94276c8-8b55-4f82-a3f8-7529a6d74fa2/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1091.235201] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7116b356-e5be-4786-9f42-c088ec36b033 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.243739] env[60175]: DEBUG oslo_vmware.api [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Waiting for the task: (returnval){ [ 1091.243739] env[60175]: value = "task-4292960" [ 1091.243739] env[60175]: _type = "Task" [ 1091.243739] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1091.251450] env[60175]: DEBUG oslo_vmware.api [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Task: {'id': task-4292960, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1091.754449] env[60175]: DEBUG oslo_vmware.exceptions [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Fault InvalidArgument not matched. 
{{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1091.754661] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1091.755230] env[60175]: ERROR nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1091.755230] env[60175]: Faults: ['InvalidArgument'] [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Traceback (most recent call last): [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] yield resources [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] self.driver.spawn(context, instance, image_meta, [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] self._fetch_image_if_missing(context, vi) [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] image_cache(vi, tmp_image_ds_loc) [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] vm_util.copy_virtual_disk( [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] session._wait_for_task(vmdk_copy_task) [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] return self.wait_for_task(task_ref) [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] return evt.wait() [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] result = hub.switch() [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] return self.greenlet.switch() [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] self.f(*self.args, **self.kw) [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] raise exceptions.translate_fault(task_info.error) [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Faults: ['InvalidArgument'] [ 1091.755230] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] [ 1091.756221] env[60175]: INFO nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Terminating instance [ 1091.757156] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1091.757370] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1091.757987] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 
tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1091.758186] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1091.758393] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-22833d63-6baa-49b7-bcd1-44465e192d8f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.760754] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afe46db3-55cc-499c-8e7f-a12539dd7265 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.767395] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1091.767604] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4d765dd5-6853-48f3-86ef-f9455ccbc9f7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.769621] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1091.769786] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1091.770715] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f426bd36-2c45-4f6a-8383-532b18676318 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.776624] env[60175]: DEBUG oslo_vmware.api [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Waiting for the task: (returnval){ [ 1091.776624] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]524537ae-0117-4a46-16a3-bc00fa56e6b2" [ 1091.776624] env[60175]: _type = "Task" [ 1091.776624] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1091.783737] env[60175]: DEBUG oslo_vmware.api [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]524537ae-0117-4a46-16a3-bc00fa56e6b2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1091.841975] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1091.842280] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1091.842430] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Deleting the datastore file [datastore2] 843d4db6-c1fb-4b74-ad3c-779e309a170e {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1091.842681] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-13fe9d6c-b9f7-409f-af3a-d1410531a037 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.848057] env[60175]: DEBUG oslo_vmware.api [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Waiting for the task: (returnval){ [ 1091.848057] env[60175]: value = "task-4292962" [ 1091.848057] env[60175]: _type = "Task" [ 1091.848057] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1091.855542] env[60175]: DEBUG oslo_vmware.api [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Task: {'id': task-4292962, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1092.287640] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1092.287895] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Creating directory with path [datastore2] vmware_temp/27090fb9-0f1f-4517-aa42-7abd55962958/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1092.288207] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cd68d6c5-5715-4333-ac37-b2aedd8e4939 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.299420] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Created directory with path [datastore2] vmware_temp/27090fb9-0f1f-4517-aa42-7abd55962958/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1092.299611] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Fetch image to [datastore2] vmware_temp/27090fb9-0f1f-4517-aa42-7abd55962958/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1092.299775] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/27090fb9-0f1f-4517-aa42-7abd55962958/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1092.300494] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2d5cf43-ee87-4c3e-a70e-889037b74451 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.311527] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b134fa1-151f-4bb2-9e9d-8a76f142d499 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.320224] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c110a82-ea27-4389-ad55-23adb72f3bfe {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.352406] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd399a4d-27ec-4a54-8557-14bb9878240d {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.359372] env[60175]: DEBUG oslo_vmware.api [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Task: {'id': task-4292962, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.0625} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1092.360776] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1092.360965] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1092.361147] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1092.361316] env[60175]: INFO nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1092.363052] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f012add3-7747-45b2-aa63-3c8491aae399 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.364908] env[60175]: DEBUG nova.compute.claims [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1092.365084] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1092.365298] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1092.385914] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1092.457628] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a17da273-5539-4afb-af8e-3d5210731e17 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.464849] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7c0cea5-d2c3-4c57-a446-2235d102cd58 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.499966] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bda6daa8-56db-45ec-b7b7-d5f36f3dfa7b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.507097] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f47cb71-60b1-4048-b443-03eb15b9c83a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.520264] env[60175]: DEBUG nova.compute.provider_tree [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1092.528889] env[60175]: DEBUG nova.scheduler.client.report [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 
tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1092.541746] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.176s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1092.542333] env[60175]: ERROR nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1092.542333] env[60175]: Faults: ['InvalidArgument'] [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Traceback (most recent call last): [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] self.driver.spawn(context, instance, image_meta, [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] self._fetch_image_if_missing(context, vi) [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] image_cache(vi, tmp_image_ds_loc) [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] vm_util.copy_virtual_disk( [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 
843d4db6-c1fb-4b74-ad3c-779e309a170e] session._wait_for_task(vmdk_copy_task) [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] return self.wait_for_task(task_ref) [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] return evt.wait() [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] result = hub.switch() [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] return self.greenlet.switch() [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] self.f(*self.args, **self.kw) [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] raise exceptions.translate_fault(task_info.error) [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Faults: ['InvalidArgument'] [ 1092.542333] env[60175]: ERROR nova.compute.manager [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] [ 1092.543834] env[60175]: DEBUG nova.compute.utils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1092.544362] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Build of instance 843d4db6-c1fb-4b74-ad3c-779e309a170e was re-scheduled: A specified parameter was not correct: fileType [ 1092.544362] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1092.544726] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 
tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1092.544949] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1092.545067] env[60175]: DEBUG nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1092.545237] env[60175]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1092.620327] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1092.621916] env[60175]: ERROR nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. 
[ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Traceback (most recent call last): [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] result = getattr(controller, method)(*args, **kwargs) [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self._get(image_id) [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] resp, body = self.http_client.get(url, headers=header) [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self.request(url, 'GET', **kwargs) [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self._handle_response(resp) [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise exc.from_response(resp, resp.content) [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] During handling of the above exception, another exception occurred: [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Traceback (most recent call last): [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] yield resources [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self.driver.spawn(context, instance, image_meta, [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self._fetch_image_if_missing(context, vi) [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] image_fetch(context, vi, tmp_image_ds_loc) [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] images.fetch_image( [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1092.621916] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] metadata = IMAGE_API.get(context, image_ref) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return session.show(context, image_id, [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] _reraise_translated_image_exception(image_id) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise new_exc.with_traceback(exc_trace) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] result = getattr(controller, method)(*args, **kwargs) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self._get(image_id) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] resp, body = self.http_client.get(url, headers=header) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self.request(url, 'GET', **kwargs) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self._handle_response(resp) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise exc.from_response(resp, resp.content) [ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. 
[ 1092.623105] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1092.623105] env[60175]: INFO nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Terminating instance [ 1092.624210] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1092.624485] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1092.624690] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fc06d1fb-7f0e-4d17-a3d0-dd04391173ce {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.627137] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1092.627331] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1092.628109] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-062baa18-c891-4378-a25d-4585be2b4b3e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.635606] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1092.635829] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3739200e-1fb5-48dd-afb0-4b62ff5bb597 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.638083] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1092.638165] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Folder [datastore2] devstack-image-cache_base 
created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1092.639101] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5f12e625-d478-4548-8ee8-ac5e87bca100 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.643672] env[60175]: DEBUG oslo_vmware.api [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Waiting for the task: (returnval){ [ 1092.643672] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]523bf778-4edf-bc5d-309b-8637b69a7254" [ 1092.643672] env[60175]: _type = "Task" [ 1092.643672] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1092.650747] env[60175]: DEBUG oslo_vmware.api [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]523bf778-4edf-bc5d-309b-8637b69a7254, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1092.703472] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1092.703472] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1092.703660] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Deleting the datastore file [datastore2] 029d2099-2e55-4632-81b6-b59d6a20faab {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1092.703854] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4fdac23e-d172-44a8-a032-35e590f26fd7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.709531] env[60175]: DEBUG oslo_vmware.api [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Waiting for the task: (returnval){ [ 1092.709531] env[60175]: value = "task-4292964" [ 1092.709531] env[60175]: _type = "Task" [ 1092.709531] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1092.717501] env[60175]: DEBUG oslo_vmware.api [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Task: {'id': task-4292964, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1092.895971] env[60175]: DEBUG nova.network.neutron [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1092.906893] env[60175]: INFO nova.compute.manager [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Took 0.36 seconds to deallocate network for instance. [ 1092.995518] env[60175]: INFO nova.scheduler.client.report [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Deleted allocations for instance 843d4db6-c1fb-4b74-ad3c-779e309a170e [ 1093.013131] env[60175]: DEBUG oslo_concurrency.lockutils [None req-bf1e476b-43f6-456b-b824-48dd7015c5a8 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "843d4db6-c1fb-4b74-ad3c-779e309a170e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 520.702s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1093.013472] env[60175]: DEBUG oslo_concurrency.lockutils [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "843d4db6-c1fb-4b74-ad3c-779e309a170e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 323.240s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1093.013789] env[60175]: DEBUG oslo_concurrency.lockutils [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Acquiring lock "843d4db6-c1fb-4b74-ad3c-779e309a170e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1093.014108] env[60175]: DEBUG oslo_concurrency.lockutils [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "843d4db6-c1fb-4b74-ad3c-779e309a170e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1093.014334] env[60175]: DEBUG oslo_concurrency.lockutils [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "843d4db6-c1fb-4b74-ad3c-779e309a170e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1093.016958] env[60175]: INFO nova.compute.manager 
[None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Terminating instance [ 1093.019188] env[60175]: DEBUG nova.compute.manager [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1093.019450] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1093.020061] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9a77acaf-6e12-4ec8-9e6c-86be5b3fc366 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.033144] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26fc2eea-03e2-4832-a025-a3765cd84186 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.067504] env[60175]: WARNING nova.virt.vmwareapi.vmops [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 843d4db6-c1fb-4b74-ad3c-779e309a170e could not be found. [ 1093.067728] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1093.067858] env[60175]: INFO nova.compute.manager [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1093.068105] env[60175]: DEBUG oslo.service.loopingcall [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1093.068329] env[60175]: DEBUG nova.compute.manager [-] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1093.068425] env[60175]: DEBUG nova.network.neutron [-] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1093.090819] env[60175]: DEBUG nova.network.neutron [-] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1093.100390] env[60175]: INFO nova.compute.manager [-] [instance: 843d4db6-c1fb-4b74-ad3c-779e309a170e] Took 0.03 seconds to deallocate network for instance. [ 1093.154157] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1093.154427] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Creating directory with path [datastore2] vmware_temp/911f2c49-f543-4061-adfd-5a4842e56c84/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1093.154651] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-630e4e96-e677-4af1-9dbc-3d376560e44a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.165989] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Created directory with path [datastore2] vmware_temp/911f2c49-f543-4061-adfd-5a4842e56c84/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1093.166192] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Fetch image to [datastore2] vmware_temp/911f2c49-f543-4061-adfd-5a4842e56c84/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1093.166357] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/911f2c49-f543-4061-adfd-5a4842e56c84/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1093.167073] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88b4402e-b724-4b99-b257-ce5d74f3f006 {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.174047] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1185300e-5a4a-45c3-8151-79ccf2942137 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.179748] env[60175]: DEBUG oslo_concurrency.lockutils [None req-47af5207-318a-4aa5-8d4a-48d0852501e3 tempest-FloatingIPsAssociationTestJSON-1720060526 tempest-FloatingIPsAssociationTestJSON-1720060526-project-member] Lock "843d4db6-c1fb-4b74-ad3c-779e309a170e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.166s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1093.186554] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8845859e-5b0d-4a46-a8a5-a62c9fa83be2 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.221933] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbdb9635-f297-44fd-be59-36d350d3681c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.230727] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7991d4c6-0036-4952-b992-98cd8161db9d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.232380] env[60175]: DEBUG oslo_vmware.api [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Task: {'id': task-4292964, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077291} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1093.232805] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1093.232994] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1093.233196] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1093.233519] env[60175]: INFO nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1093.235329] env[60175]: DEBUG nova.compute.claims [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1093.235527] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1093.235739] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1093.258624] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1093.261782] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1093.262393] env[60175]: DEBUG nova.compute.utils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Instance 029d2099-2e55-4632-81b6-b59d6a20faab could not be found. {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1093.263749] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Instance disappeared during build. {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1093.263913] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1093.264112] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1093.264301] env[60175]: DEBUG nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1093.264460] env[60175]: DEBUG nova.network.neutron [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1093.290882] env[60175]: DEBUG neutronclient.v2_0.client [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60175) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1093.294864] env[60175]: ERROR nova.compute.manager [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Traceback (most recent call last): [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] result = getattr(controller, method)(*args, **kwargs) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self._get(image_id) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] resp, body = self.http_client.get(url, headers=header) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self.request(url, 'GET', **kwargs) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self._handle_response(resp) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise exc.from_response(resp, resp.content) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] During handling of the above exception, another exception occurred: [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Traceback (most recent call last): [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self.driver.spawn(context, instance, image_meta, [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self._fetch_image_if_missing(context, vi) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] image_fetch(context, vi, tmp_image_ds_loc) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] images.fetch_image( [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 
029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] metadata = IMAGE_API.get(context, image_ref) [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return session.show(context, image_id, [ 1093.294864] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] _reraise_translated_image_exception(image_id) [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise new_exc.with_traceback(exc_trace) [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] result = getattr(controller, method)(*args, **kwargs) [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self._get(image_id) [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] resp, body = self.http_client.get(url, headers=header) [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self.request(url, 'GET', **kwargs) [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] 
return self._handle_response(resp) [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise exc.from_response(resp, resp.content) [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] During handling of the above exception, another exception occurred: [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Traceback (most recent call last): [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self._build_and_run_instance(context, instance, image, [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] with excutils.save_and_reraise_exception(): [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self.force_reraise() [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise self.value [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] with self.rt.instance_claim(context, instance, node, allocs, [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self.abort() [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1093.296015] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return f(*args, **kwargs) [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self._unset_instance_host_and_node(instance) [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] instance.save() [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] updates, result = self.indirection_api.object_action( [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return cctxt.call(context, 'object_action', objinst=objinst, [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] result = self.transport._send( [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self._driver.send(target, ctxt, message, [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise result [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] nova.exception_Remote.InstanceNotFound_Remote: Instance 029d2099-2e55-4632-81b6-b59d6a20faab could not be found. 
[ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Traceback (most recent call last): [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return getattr(target, method)(*args, **kwargs) [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return fn(self, *args, **kwargs) [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] old_ref, inst_ref = db.instance_update_and_get_original( [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return f(*args, **kwargs) [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] with excutils.save_and_reraise_exception() as ectxt: [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self.force_reraise() [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise self.value [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return f(*args, 
**kwargs) [ 1093.297180] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return f(context, *args, **kwargs) [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise exception.InstanceNotFound(instance_id=uuid) [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] nova.exception.InstanceNotFound: Instance 029d2099-2e55-4632-81b6-b59d6a20faab could not be found. [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] During handling of the above exception, another exception occurred: [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Traceback (most recent call last): [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] ret = obj(*args, **kwargs) [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] exception_handler_v20(status_code, error_body) [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise client_exc(message=error_message, [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1093.298915] 
env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Neutron server returns request_ids: ['req-4a7ecddc-9ee4-4c08-bd1a-19a12231caf3'] [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] During handling of the above exception, another exception occurred: [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Traceback (most recent call last): [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self._deallocate_network(context, instance, requested_networks) [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self.network_api.deallocate_for_instance( [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] data = neutron.list_ports(**search_opts) [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] ret = obj(*args, **kwargs) [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self.list('ports', self.ports_path, retrieve_all, [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] ret = obj(*args, **kwargs) [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1093.298915] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] for r in self._pagination(collection, path, **params): [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] res = self.get(path, params=params) [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] ret = obj(*args, **kwargs) [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self.retry_request("GET", action, body=body, [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] ret = obj(*args, **kwargs) [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] return self.do_request(method, action, body=body, [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] ret = obj(*args, **kwargs) [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] self._handle_fault_response(status_code, replybody, resp) [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] raise exception.Unauthorized() [ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] nova.exception.Unauthorized: Not authorized. 
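The chained traceback that ends here is Python's implicit exception chaining combined with nova's explicit translation step: glance.show() catches the glanceclient HTTPUnauthorized and re-raises it as ImageNotAuthorized with the original traceback attached (the raise new_exc.with_traceback(exc_trace) frame visible above), and the later claim-abort and neutron failures happen inside the handlers for that exception, which is what produces each "During handling of the above exception, another exception occurred" section. A minimal sketch of the translation pattern, with stand-in exception classes rather than the real nova/glanceclient types:

    import sys

    class HTTPUnauthorized(Exception):
        """Stand-in for the client-level 401 error."""

    class ImageNotAuthorized(Exception):
        """Stand-in for the service-level exception the caller should see."""

    def _translate(image_id):
        # Re-raise the in-flight client exception as a service exception while
        # keeping the original traceback, as the translation helper above does.
        _exc_type, exc_value, exc_trace = sys.exc_info()
        if isinstance(exc_value, HTTPUnauthorized):
            new_exc = ImageNotAuthorized("Not authorized for image %s." % image_id)
            raise new_exc.with_traceback(exc_trace)
        raise  # anything else propagates untouched

    def show(image_id):
        try:
            raise HTTPUnauthorized("HTTP 401 Unauthorized")  # simulated client call
        except Exception:
            _translate(image_id)

Calling show() from this sketch reproduces the two-part shape seen in the log: the client-level frames first, then the translated exception, joined by the "During handling of the above exception" banner because the re-raise happens inside the except block.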
[ 1093.300388] env[60175]: ERROR nova.compute.manager [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] [ 1093.319588] env[60175]: DEBUG oslo_concurrency.lockutils [None req-8b4617f1-9a6e-460e-8273-186e336d6a2f tempest-AttachVolumeNegativeTest-895404657 tempest-AttachVolumeNegativeTest-895404657-project-member] Lock "029d2099-2e55-4632-81b6-b59d6a20faab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 457.280s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1093.358806] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1093.359568] env[60175]: ERROR nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Traceback (most recent call last): [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] result = getattr(controller, method)(*args, **kwargs) [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self._get(image_id) [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] resp, body = self.http_client.get(url, headers=header) [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self.request(url, 'GET', **kwargs) [ 1093.359568] env[60175]: ERROR 
nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self._handle_response(resp) [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise exc.from_response(resp, resp.content) [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] During handling of the above exception, another exception occurred: [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Traceback (most recent call last): [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] yield resources [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self.driver.spawn(context, instance, image_meta, [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self._fetch_image_if_missing(context, vi) [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] image_fetch(context, vi, tmp_image_ds_loc) [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] images.fetch_image( [ 1093.359568] env[60175]: 
ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1093.359568] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] metadata = IMAGE_API.get(context, image_ref) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return session.show(context, image_id, [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] _reraise_translated_image_exception(image_id) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise new_exc.with_traceback(exc_trace) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] result = getattr(controller, method)(*args, **kwargs) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self._get(image_id) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] resp, body = self.http_client.get(url, headers=header) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self.request(url, 'GET', **kwargs) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 
068814dd-328c-48d1-b514-34eb43b0f2b1] return self._handle_response(resp) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise exc.from_response(resp, resp.content) [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1093.360778] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1093.360778] env[60175]: INFO nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Terminating instance [ 1093.362156] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1093.362369] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1093.362978] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Start destroying the instance on the hypervisor. 
{{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1093.363203] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1093.363431] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-99c531cb-2b22-43f1-8b75-aa9811eaad73 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.365949] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eedf1cab-4482-4a73-b883-9c4e17eccc48 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.373112] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1093.373321] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-705c9fd6-693a-4fc6-af33-6a4dd1d33c3e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.375483] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1093.375653] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1093.376579] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7831436e-90ee-4dc7-a374-7f06016c61fc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.382016] env[60175]: DEBUG oslo_vmware.api [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Waiting for the task: (returnval){ [ 1093.382016] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52493b96-6eb2-4f64-62e7-bced1c7a9cf8" [ 1093.382016] env[60175]: _type = "Task" [ 1093.382016] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1093.389242] env[60175]: DEBUG oslo_vmware.api [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52493b96-6eb2-4f64-62e7-bced1c7a9cf8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1093.444772] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1093.445036] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1093.445256] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Deleting the datastore file [datastore2] 068814dd-328c-48d1-b514-34eb43b0f2b1 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1093.445547] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f544c56a-d648-43c1-97d3-2817f8b8076c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.451983] env[60175]: DEBUG oslo_vmware.api [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Waiting for the task: (returnval){ [ 1093.451983] env[60175]: value = "task-4292966" [ 1093.451983] env[60175]: _type = "Task" [ 1093.451983] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1093.459516] env[60175]: DEBUG oslo_vmware.api [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Task: {'id': task-4292966, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1093.892558] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1093.892803] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Creating directory with path [datastore2] vmware_temp/d4fac43b-2029-43dc-92ca-ad3a20200ae8/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1093.893088] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-843a7506-2416-4df8-9a3d-3bf589386378 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.904081] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Created directory with path [datastore2] vmware_temp/d4fac43b-2029-43dc-92ca-ad3a20200ae8/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1093.904286] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Fetch image to [datastore2] vmware_temp/d4fac43b-2029-43dc-92ca-ad3a20200ae8/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1093.904442] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/d4fac43b-2029-43dc-92ca-ad3a20200ae8/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1093.905129] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e739dce9-5ec1-4b3b-aa62-dc0ac281d732 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.911573] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa156dbb-55f1-456e-89a0-ab961c907912 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.920942] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39384843-bdc6-4f4e-9be2-76b9171446e0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.956289] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a9c927e-645a-4ca2-8276-9f597889c7d1 {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.964457] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2eb4ee38-cb0e-476e-aba4-ea039446665a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.966096] env[60175]: DEBUG oslo_vmware.api [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Task: {'id': task-4292966, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065796} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1093.966324] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1093.966498] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1093.966659] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1093.966824] env[60175]: INFO nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Took 0.60 seconds to destroy the instance on the hypervisor. 
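The resource-tracker teardown that follows (Aborting claim, then acquiring and releasing the "compute_resources" lock with waited/held timings) is driven by a decorator-style lock wrapper whose DEBUG output records how long the lock was waited for and held. A rough illustration of that pattern, using a plain threading.Lock and print() instead of oslo.concurrency's lock implementation and oslo.log:

    import functools
    import threading
    import time

    _LOCKS = {}

    def synchronized(name):
        # Illustrative stand-in for a lockutils-style decorator; the wait/hold
        # timings it prints mirror the "acquired :: waited" / "released :: held"
        # lines in the log, not the library's actual internals.
        lock = _LOCKS.setdefault(name, threading.Lock())

        def decorator(func):
            @functools.wraps(func)
            def inner(*args, **kwargs):
                print('Acquiring lock "%s" by "%s"' % (name, func.__qualname__))
                t0 = time.monotonic()
                with lock:
                    print('Lock "%s" acquired :: waited %.3fs'
                          % (name, time.monotonic() - t0))
                    t1 = time.monotonic()
                    try:
                        return func(*args, **kwargs)
                    finally:
                        print('Lock "%s" released :: held %.3fs'
                              % (name, time.monotonic() - t1))
            return inner
        return decorator

    @synchronized("compute_resources")
    def abort_instance_claim():
        # Placeholder body standing in for the resource tracker's claim cleanup.
        time.sleep(0.01)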
[ 1093.968900] env[60175]: DEBUG nova.compute.claims [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1093.969074] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1093.969292] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1093.985613] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1094.002932] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.034s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1094.003752] env[60175]: DEBUG nova.compute.utils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Instance 068814dd-328c-48d1-b514-34eb43b0f2b1 could not be found. {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1094.005397] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Instance disappeared during build. {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1094.005598] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1094.005765] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1094.005928] env[60175]: DEBUG nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1094.006106] env[60175]: DEBUG nova.network.neutron [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1094.031888] env[60175]: DEBUG neutronclient.v2_0.client [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60175) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1094.033555] env[60175]: ERROR nova.compute.manager [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Traceback (most recent call last): [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] result = getattr(controller, method)(*args, **kwargs) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self._get(image_id) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] resp, body = self.http_client.get(url, headers=header) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1094.033555] 
env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self.request(url, 'GET', **kwargs) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self._handle_response(resp) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise exc.from_response(resp, resp.content) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] During handling of the above exception, another exception occurred: [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Traceback (most recent call last): [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self.driver.spawn(context, instance, image_meta, [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self._fetch_image_if_missing(context, vi) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] image_fetch(context, vi, tmp_image_ds_loc) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] images.fetch_image( [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, 
in fetch_image [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] metadata = IMAGE_API.get(context, image_ref) [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return session.show(context, image_id, [ 1094.033555] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] _reraise_translated_image_exception(image_id) [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise new_exc.with_traceback(exc_trace) [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] result = getattr(controller, method)(*args, **kwargs) [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self._get(image_id) [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] resp, body = self.http_client.get(url, headers=header) [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self.request(url, 'GET', **kwargs) [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self._handle_response(resp) [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 
068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise exc.from_response(resp, resp.content) [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] During handling of the above exception, another exception occurred: [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Traceback (most recent call last): [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self._build_and_run_instance(context, instance, image, [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] with excutils.save_and_reraise_exception(): [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self.force_reraise() [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise self.value [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] with self.rt.instance_claim(context, instance, node, allocs, [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self.abort() [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1094.034766] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1094.034766] env[60175]: ERROR 
nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return f(*args, **kwargs) [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self._unset_instance_host_and_node(instance) [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] instance.save() [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] updates, result = self.indirection_api.object_action( [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return cctxt.call(context, 'object_action', objinst=objinst, [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] result = self.transport._send( [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self._driver.send(target, ctxt, message, [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise result [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] nova.exception_Remote.InstanceNotFound_Remote: Instance 068814dd-328c-48d1-b514-34eb43b0f2b1 could not be found. 
[ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Traceback (most recent call last): [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return getattr(target, method)(*args, **kwargs) [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return fn(self, *args, **kwargs) [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] old_ref, inst_ref = db.instance_update_and_get_original( [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return f(*args, **kwargs) [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] with excutils.save_and_reraise_exception() as ectxt: [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self.force_reraise() [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise self.value [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return f(*args, 
**kwargs) [ 1094.035968] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return f(context, *args, **kwargs) [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise exception.InstanceNotFound(instance_id=uuid) [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] nova.exception.InstanceNotFound: Instance 068814dd-328c-48d1-b514-34eb43b0f2b1 could not be found. [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] During handling of the above exception, another exception occurred: [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Traceback (most recent call last): [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] ret = obj(*args, **kwargs) [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] exception_handler_v20(status_code, error_body) [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise client_exc(message=error_message, [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1094.037192] 
env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Neutron server returns request_ids: ['req-9c6c7144-6e43-486d-b560-33cd371e8711'] [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] During handling of the above exception, another exception occurred: [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] Traceback (most recent call last): [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self._deallocate_network(context, instance, requested_networks) [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self.network_api.deallocate_for_instance( [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] data = neutron.list_ports(**search_opts) [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] ret = obj(*args, **kwargs) [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self.list('ports', self.ports_path, retrieve_all, [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] ret = obj(*args, **kwargs) [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1094.037192] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] for r in self._pagination(collection, path, **params): [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] res = self.get(path, params=params) [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] ret = obj(*args, **kwargs) [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self.retry_request("GET", action, body=body, [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] ret = obj(*args, **kwargs) [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] return self.do_request(method, action, body=body, [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] ret = obj(*args, **kwargs) [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] self._handle_fault_response(status_code, replybody, resp) [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] raise exception.Unauthorized() [ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] nova.exception.Unauthorized: Not authorized. 
[ 1094.038342] env[60175]: ERROR nova.compute.manager [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] [ 1094.055576] env[60175]: DEBUG oslo_concurrency.lockutils [None req-b31fc1cf-5d9f-4eaf-854f-7c8f3ca32f3b tempest-ServersTestJSON-97606219 tempest-ServersTestJSON-97606219-project-member] Lock "068814dd-328c-48d1-b514-34eb43b0f2b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 456.066s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1094.085639] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1094.086482] env[60175]: ERROR nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Traceback (most recent call last): [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] result = getattr(controller, method)(*args, **kwargs) [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self._get(image_id) [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] resp, body = self.http_client.get(url, headers=header) [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self.request(url, 'GET', **kwargs) [ 1094.086482] 
env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self._handle_response(resp) [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise exc.from_response(resp, resp.content) [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] During handling of the above exception, another exception occurred: [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Traceback (most recent call last): [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] yield resources [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self.driver.spawn(context, instance, image_meta, [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self._fetch_image_if_missing(context, vi) [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] image_fetch(context, vi, tmp_image_ds_loc) [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] images.fetch_image( [ 
1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1094.086482] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] metadata = IMAGE_API.get(context, image_ref) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return session.show(context, image_id, [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] _reraise_translated_image_exception(image_id) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise new_exc.with_traceback(exc_trace) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] result = getattr(controller, method)(*args, **kwargs) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self._get(image_id) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] resp, body = self.http_client.get(url, headers=header) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self.request(url, 'GET', **kwargs) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1094.087592] env[60175]: ERROR 
nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self._handle_response(resp) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise exc.from_response(resp, resp.content) [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1094.087592] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.087592] env[60175]: INFO nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Terminating instance [ 1094.088377] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1094.088545] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1094.089169] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Start destroying the instance on the hypervisor. 
{{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1094.089358] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1094.089587] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-29d1f46c-893e-4426-974d-86241aaed1d8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.092664] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7437ae67-f215-4690-a1f7-c0cd11eb6604 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.100050] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1094.101092] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2cf3d4ab-c7e8-4b3b-a1ec-0416c6f1eaa3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.102483] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1094.102649] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1094.103339] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3c9dda79-d058-43e5-862c-a5ef4bb4507b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.108728] env[60175]: DEBUG oslo_vmware.api [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Waiting for the task: (returnval){ [ 1094.108728] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52589ced-5c27-6b81-b804-9cc0e164acc8" [ 1094.108728] env[60175]: _type = "Task" [ 1094.108728] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1094.119968] env[60175]: DEBUG oslo_vmware.api [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52589ced-5c27-6b81-b804-9cc0e164acc8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1094.163191] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1094.163433] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1094.163613] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Deleting the datastore file [datastore2] 500d78f9-ee0c-4620-9936-1a9b4f4fc09a {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1094.163874] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-86ae6fd2-1ff5-4ba4-8dda-cdfc83508fbd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.171208] env[60175]: DEBUG oslo_vmware.api [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Waiting for the task: (returnval){ [ 1094.171208] env[60175]: value = "task-4292968" [ 1094.171208] env[60175]: _type = "Task" [ 1094.171208] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1094.180922] env[60175]: DEBUG oslo_vmware.api [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Task: {'id': task-4292968, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1094.619116] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1094.619476] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Creating directory with path [datastore2] vmware_temp/235ade2f-0363-4568-89e8-f97d047ef843/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1094.619651] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8b7453e7-f0a9-4f3f-82bf-a7418012ea94 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.630457] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Created directory with path [datastore2] vmware_temp/235ade2f-0363-4568-89e8-f97d047ef843/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1094.630691] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Fetch image to [datastore2] vmware_temp/235ade2f-0363-4568-89e8-f97d047ef843/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1094.630905] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/235ade2f-0363-4568-89e8-f97d047ef843/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1094.631656] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c33390c3-d7ec-467d-b4c2-7d1373f69817 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.638084] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc41f8d4-dcf5-4b3b-9527-25e37de72d43 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.646736] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69f63364-1705-469e-be64-bdc20172ebdb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.680798] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75ed385c-f89b-4fcc-ae7e-a1934e7a7873 {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.687540] env[60175]: DEBUG oslo_vmware.api [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Task: {'id': task-4292968, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076896} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1094.688948] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1094.689209] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1094.689447] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1094.689660] env[60175]: INFO nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1094.691412] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-92e1983b-2717-4d3b-af9c-63c433c01a2c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.693311] env[60175]: DEBUG nova.compute.claims [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1094.693514] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1094.693788] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1094.715702] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1094.720159] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1094.720800] env[60175]: DEBUG nova.compute.utils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Instance 500d78f9-ee0c-4620-9936-1a9b4f4fc09a could not be found. {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1094.722174] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Instance disappeared during build. 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1094.722341] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1094.722501] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1094.722664] env[60175]: DEBUG nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1094.722819] env[60175]: DEBUG nova.network.neutron [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1094.746719] env[60175]: DEBUG neutronclient.v2_0.client [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60175) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1094.748290] env[60175]: ERROR nova.compute.manager [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Traceback (most recent call last): [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] result = getattr(controller, method)(*args, **kwargs) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self._get(image_id) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] resp, body = self.http_client.get(url, headers=header) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self.request(url, 'GET', **kwargs) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self._handle_response(resp) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise exc.from_response(resp, resp.content) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] During handling of the above exception, another exception occurred: [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Traceback (most recent call last): [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self.driver.spawn(context, instance, image_meta, [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self._fetch_image_if_missing(context, vi) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] image_fetch(context, vi, tmp_image_ds_loc) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] images.fetch_image( [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] metadata = IMAGE_API.get(context, image_ref) [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return session.show(context, image_id, [ 1094.748290] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] _reraise_translated_image_exception(image_id) [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise new_exc.with_traceback(exc_trace) [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 
500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] result = getattr(controller, method)(*args, **kwargs) [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self._get(image_id) [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] resp, body = self.http_client.get(url, headers=header) [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self.request(url, 'GET', **kwargs) [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self._handle_response(resp) [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise exc.from_response(resp, resp.content) [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. 
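The traceback above shows the first failure in the chain: the glance client call at nova/image/glance.py:285 comes back with HTTP 401, and show() re-raises it through _reraise_translated_image_exception() as nova.exception.ImageNotAuthorized. The following sketch only illustrates that translate-and-reraise pattern; the exception classes are stand-ins, not nova's real implementation.

    # Sketch of the translate-and-reraise pattern visible in the traceback above
    # (show() -> _reraise_translated_image_exception()). Class names are stand-ins.
    import sys

    class HTTPUnauthorized(Exception):
        """Stands in for glanceclient.exc.HTTPUnauthorized."""

    class ImageNotAuthorized(Exception):
        """Stands in for nova.exception.ImageNotAuthorized."""
        def __init__(self, image_id):
            super().__init__("Not authorized for image %s." % image_id)

    def _reraise_translated_image_exception(image_id):
        # Re-raise the exception currently being handled, keeping its traceback.
        _exc_type, exc_value, exc_trace = sys.exc_info()
        if isinstance(exc_value, HTTPUnauthorized):
            raise ImageNotAuthorized(image_id).with_traceback(exc_trace)
        raise exc_value

    def show(image_id):
        try:
            raise HTTPUnauthorized("HTTP 401 Unauthorized")  # stand-in for the glance GET
        except Exception:
            _reraise_translated_image_exception(image_id)

    try:
        show("ab7fcb5a-745a-4c08-9c04-49b187178f83")
    except ImageNotAuthorized as exc:
        print(exc)  # Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83.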
[ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] During handling of the above exception, another exception occurred: [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Traceback (most recent call last): [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self._build_and_run_instance(context, instance, image, [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] with excutils.save_and_reraise_exception(): [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self.force_reraise() [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise self.value [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] with self.rt.instance_claim(context, instance, node, allocs, [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self.abort() [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1094.749214] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return f(*args, **kwargs) [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self._unset_instance_host_and_node(instance) [ 1094.750363] env[60175]: ERROR nova.compute.manager 
[instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] instance.save() [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] updates, result = self.indirection_api.object_action( [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return cctxt.call(context, 'object_action', objinst=objinst, [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] result = self.transport._send( [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self._driver.send(target, ctxt, message, [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise result [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] nova.exception_Remote.InstanceNotFound_Remote: Instance 500d78f9-ee0c-4620-9936-1a9b4f4fc09a could not be found. 
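The exception type here, nova.exception_Remote.InstanceNotFound_Remote, is how oslo.messaging surfaces an exception raised on the conductor side: the original InstanceNotFound is serialized over RPC and rebuilt on the compute node as a dynamically created subclass whose class and module names carry a _Remote suffix, with the server-side traceback (quoted in the following lines) attached to the message. A conceptual sketch of that naming convention, not oslo.messaging's actual code:

    # Conceptual sketch of rebuilding a remote exception with a "_Remote" suffix.
    class InstanceNotFound(Exception):
        pass

    def rebuild_remote_exception(exc_class, message, remote_traceback):
        # Dynamically derive <Name>_Remote in module <module>_Remote and append
        # the server-side traceback text to the message, as seen in the log above.
        remote_cls = type(exc_class.__name__ + "_Remote", (exc_class,), {})
        remote_cls.__module__ = exc_class.__module__ + "_Remote"
        return remote_cls("%s\n%s" % (message, remote_traceback))

    exc = rebuild_remote_exception(
        InstanceNotFound,
        "Instance 500d78f9-ee0c-4620-9936-1a9b4f4fc09a could not be found.",
        "Traceback (most recent call last): ...",
    )
    print(type(exc).__module__, type(exc).__name__)  # __main___Remote InstanceNotFound_Remote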
[ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Traceback (most recent call last): [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return getattr(target, method)(*args, **kwargs) [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return fn(self, *args, **kwargs) [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] old_ref, inst_ref = db.instance_update_and_get_original( [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return f(*args, **kwargs) [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] with excutils.save_and_reraise_exception() as ectxt: [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self.force_reraise() [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise self.value [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return f(*args, 
**kwargs) [ 1094.750363] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return f(context, *args, **kwargs) [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise exception.InstanceNotFound(instance_id=uuid) [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] nova.exception.InstanceNotFound: Instance 500d78f9-ee0c-4620-9936-1a9b4f4fc09a could not be found. [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] During handling of the above exception, another exception occurred: [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Traceback (most recent call last): [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] ret = obj(*args, **kwargs) [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] exception_handler_v20(status_code, error_body) [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise client_exc(message=error_message, [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1094.751604] 
env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Neutron server returns request_ids: ['req-1226fc22-e52f-4cbe-a34a-5680adcd2724'] [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] During handling of the above exception, another exception occurred: [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Traceback (most recent call last): [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self._deallocate_network(context, instance, requested_networks) [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self.network_api.deallocate_for_instance( [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] data = neutron.list_ports(**search_opts) [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] ret = obj(*args, **kwargs) [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self.list('ports', self.ports_path, retrieve_all, [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] ret = obj(*args, **kwargs) [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1094.751604] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] for r in self._pagination(collection, path, **params): [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] res = self.get(path, params=params) [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] ret = obj(*args, **kwargs) [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self.retry_request("GET", action, body=body, [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] ret = obj(*args, **kwargs) [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] return self.do_request(method, action, body=body, [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] ret = obj(*args, **kwargs) [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] self._handle_fault_response(status_code, replybody, resp) [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] raise exception.Unauthorized() [ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] nova.exception.Unauthorized: Not authorized. 
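The cleanup path dies on the same authentication problem: the wrapper at nova/network/neutron.py:196/204 in the traceback converts neutronclient's Unauthorized (the 401 from list_ports) into nova.exception.Unauthorized, which is the "Failed to deallocate networks" error reported above. A minimal sketch of that wrapper pattern, with stand-in exception classes rather than nova's real ones:

    # Sketch of the neutron error-translation wrapper shown in the traceback.
    import functools

    class NeutronClientUnauthorized(Exception):
        """Stands in for neutronclient.common.exceptions.Unauthorized."""

    class NovaUnauthorized(Exception):
        """Stands in for nova.exception.Unauthorized."""

    def translate_neutron_auth_errors(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except NeutronClientUnauthorized:
                # The token carried by the request context is no longer accepted.
                raise NovaUnauthorized("Not authorized.")
        return wrapper

    @translate_neutron_auth_errors
    def list_ports(**search_opts):
        raise NeutronClientUnauthorized("401 Unauthorized")  # simulated neutron response

    try:
        list_ports(device_id="500d78f9-ee0c-4620-9936-1a9b4f4fc09a")
    except NovaUnauthorized as exc:
        print(exc)  # Not authorized.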
[ 1094.752866] env[60175]: ERROR nova.compute.manager [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] [ 1094.768437] env[60175]: DEBUG oslo_concurrency.lockutils [None req-c0a3ac35-b9c2-4466-a1d5-80f4c90401c3 tempest-ServerDiskConfigTestJSON-962126213 tempest-ServerDiskConfigTestJSON-962126213-project-member] Lock "500d78f9-ee0c-4620-9936-1a9b4f4fc09a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 453.454s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1094.807709] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1094.808486] env[60175]: ERROR nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Traceback (most recent call last): [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] result = getattr(controller, method)(*args, **kwargs) [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] return self._get(image_id) [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] resp, body = self.http_client.get(url, headers=header) [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] return self.request(url, 'GET', **kwargs) [ 1094.808486] 
env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] return self._handle_response(resp) [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] raise exc.from_response(resp, resp.content) [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] During handling of the above exception, another exception occurred: [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Traceback (most recent call last): [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] yield resources [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] self.driver.spawn(context, instance, image_meta, [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] self._fetch_image_if_missing(context, vi) [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] image_fetch(context, vi, tmp_image_ds_loc) [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] images.fetch_image( [ 
1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1094.808486] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] metadata = IMAGE_API.get(context, image_ref) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] return session.show(context, image_id, [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] _reraise_translated_image_exception(image_id) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] raise new_exc.with_traceback(exc_trace) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] result = getattr(controller, method)(*args, **kwargs) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] return self._get(image_id) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] resp, body = self.http_client.get(url, headers=header) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] return self.request(url, 'GET', **kwargs) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1094.809824] env[60175]: ERROR 
nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] return self._handle_response(resp) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] raise exc.from_response(resp, resp.content) [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1094.809824] env[60175]: ERROR nova.compute.manager [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] [ 1094.809824] env[60175]: INFO nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Terminating instance [ 1094.810517] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1094.810517] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1094.810950] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "refresh_cache-57a5dcae-6861-418a-a041-9cd5b7a43982" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1094.811128] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquired lock "refresh_cache-57a5dcae-6861-418a-a041-9cd5b7a43982" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1094.811296] env[60175]: DEBUG nova.network.neutron [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1094.812169] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-15c17a14-1a99-4589-81b0-f26190c844db {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.818731] env[60175]: DEBUG nova.compute.utils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Can not refresh info_cache because instance was not found {{(pid=60175) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1094.821891] env[60175]: 
DEBUG nova.virt.vmwareapi.ds_util [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1094.822074] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1094.823075] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-418d1f01-7aff-447f-a8d5-d3c9b06d5ce5 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.828090] env[60175]: DEBUG oslo_vmware.api [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Waiting for the task: (returnval){ [ 1094.828090] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52ec01a1-f286-711e-d78d-9517281e9a4b" [ 1094.828090] env[60175]: _type = "Task" [ 1094.828090] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1094.836702] env[60175]: DEBUG oslo_vmware.api [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52ec01a1-f286-711e-d78d-9517281e9a4b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1094.846734] env[60175]: DEBUG nova.network.neutron [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Instance cache missing network info. {{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1094.902446] env[60175]: DEBUG nova.network.neutron [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1094.912014] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Releasing lock "refresh_cache-57a5dcae-6861-418a-a041-9cd5b7a43982" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1094.912413] env[60175]: DEBUG nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Start destroying the instance on the hypervisor. 
{{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1094.912604] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1094.914693] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f0d8ffc-3da3-493a-8142-f7d1a432971a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.921666] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1094.921885] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-97b91cd0-e9bf-4abe-9c54-490001a4826c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.955081] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1094.955284] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1094.955455] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Deleting the datastore file [datastore2] 57a5dcae-6861-418a-a041-9cd5b7a43982 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1094.955683] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9a16b3b8-7594-4c22-8092-8b972d791aae {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.961709] env[60175]: DEBUG oslo_vmware.api [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Waiting for the task: (returnval){ [ 1094.961709] env[60175]: value = "task-4292970" [ 1094.961709] env[60175]: _type = "Task" [ 1094.961709] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1094.968609] env[60175]: DEBUG oslo_vmware.api [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Task: {'id': task-4292970, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1095.338748] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1095.338961] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Creating directory with path [datastore2] vmware_temp/971c7ade-e12e-4ce8-bac1-e16745b41ba7/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1095.339198] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-832223bc-cb9b-445f-9b6a-30bf4154ae8f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.350361] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Created directory with path [datastore2] vmware_temp/971c7ade-e12e-4ce8-bac1-e16745b41ba7/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1095.350538] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Fetch image to [datastore2] vmware_temp/971c7ade-e12e-4ce8-bac1-e16745b41ba7/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1095.350705] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/971c7ade-e12e-4ce8-bac1-e16745b41ba7/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1095.351479] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-640ca7f1-87d2-4a77-8efe-080ddd68c475 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.360055] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d80dc00d-af15-48db-bb78-ba4acb4fca95 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.367867] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3c80ccb-202f-488e-b7f7-cf61ef633b8d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.397116] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7a1da1b-9578-4df2-896c-967087c83b6b {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.402304] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-86c7c6e3-8907-402d-8748-4dff79f82992 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.423941] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1095.470903] env[60175]: DEBUG oslo_vmware.api [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Task: {'id': task-4292970, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.031681} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1095.471200] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1095.471390] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1095.471556] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1095.471725] env[60175]: INFO nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1095.471952] env[60175]: DEBUG oslo.service.loopingcall [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1095.472159] env[60175]: DEBUG nova.compute.manager [-] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1095.475586] env[60175]: DEBUG nova.compute.claims [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1095.475752] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1095.476126] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1095.499188] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.023s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1095.499896] env[60175]: DEBUG nova.compute.utils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Instance 57a5dcae-6861-418a-a041-9cd5b7a43982 could not be found. {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1095.501406] env[60175]: DEBUG nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Instance disappeared during build. 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1095.501533] env[60175]: DEBUG nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1095.501748] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquiring lock "refresh_cache-57a5dcae-6861-418a-a041-9cd5b7a43982" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1095.501891] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Acquired lock "refresh_cache-57a5dcae-6861-418a-a041-9cd5b7a43982" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1095.502056] env[60175]: DEBUG nova.network.neutron [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1095.509320] env[60175]: DEBUG nova.compute.utils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Can not refresh info_cache because instance was not found {{(pid=60175) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1095.520159] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1095.520889] env[60175]: ERROR nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. 
[ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Traceback (most recent call last): [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] result = getattr(controller, method)(*args, **kwargs) [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self._get(image_id) [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] resp, body = self.http_client.get(url, headers=header) [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self.request(url, 'GET', **kwargs) [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self._handle_response(resp) [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise exc.from_response(resp, resp.content) [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] During handling of the above exception, another exception occurred: [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Traceback (most recent call last): [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] yield resources [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self.driver.spawn(context, instance, image_meta, [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self._fetch_image_if_missing(context, vi) [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] image_fetch(context, vi, tmp_image_ds_loc) [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] images.fetch_image( [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1095.520889] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] metadata = IMAGE_API.get(context, image_ref) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return session.show(context, image_id, [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] _reraise_translated_image_exception(image_id) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise new_exc.with_traceback(exc_trace) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] result = getattr(controller, method)(*args, **kwargs) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self._get(image_id) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] resp, body = self.http_client.get(url, headers=header) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self.request(url, 'GET', **kwargs) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self._handle_response(resp) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise exc.from_response(resp, resp.content) [ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. 
[ 1095.522185] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1095.522185] env[60175]: INFO nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Terminating instance [ 1095.523116] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1095.523116] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1095.523317] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1095.523499] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1095.523729] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-03c0aa5e-3eff-46ae-9f5e-36ffffccf68e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.526255] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80b65cd6-6741-4c76-9a75-047f6ceb3689 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.529562] env[60175]: DEBUG nova.network.neutron [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1095.535957] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1095.536195] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-782fecc6-0756-4fdb-a7cd-dbc7f4e784a3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.538375] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1095.538548] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1095.539517] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5f0d5ab4-84c0-4c94-b417-381ea5a427dd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.544314] env[60175]: DEBUG oslo_vmware.api [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Waiting for the task: (returnval){ [ 1095.544314] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52824ebe-97f2-7595-40ff-ac0614481be4" [ 1095.544314] env[60175]: _type = "Task" [ 1095.544314] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1095.554069] env[60175]: DEBUG oslo_vmware.api [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52824ebe-97f2-7595-40ff-ac0614481be4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1095.601803] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1095.601937] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1095.602128] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Deleting the datastore file [datastore2] 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1095.602401] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f09e2f38-2f75-4620-9959-316ab52e3d67 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.609238] env[60175]: DEBUG oslo_vmware.api [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Waiting for the task: (returnval){ [ 1095.609238] env[60175]: value = "task-4292972" [ 1095.609238] env[60175]: _type = "Task" [ 1095.609238] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1095.616773] env[60175]: DEBUG oslo_vmware.api [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Task: {'id': task-4292972, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1095.623916] env[60175]: DEBUG nova.network.neutron [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1095.635261] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Releasing lock "refresh_cache-57a5dcae-6861-418a-a041-9cd5b7a43982" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1095.635471] env[60175]: DEBUG nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1095.635649] env[60175]: DEBUG nova.compute.manager [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Skipping network deallocation for instance since networking was not requested. {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1095.680134] env[60175]: DEBUG oslo_concurrency.lockutils [None req-f22188ee-a399-4795-91f6-06999d745a8f tempest-ServerShowV254Test-1786726166 tempest-ServerShowV254Test-1786726166-project-member] Lock "57a5dcae-6861-418a-a041-9cd5b7a43982" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 451.122s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1096.056926] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1096.057193] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Creating directory with path [datastore2] vmware_temp/67c101a9-f6ee-48ab-bb9b-cc3ba4990406/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1096.057414] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4607c5a3-bbec-4e32-aac9-241fdebc4a72 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.068366] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Created directory with path [datastore2] vmware_temp/67c101a9-f6ee-48ab-bb9b-cc3ba4990406/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1096.068538] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Fetch image to [datastore2] vmware_temp/67c101a9-f6ee-48ab-bb9b-cc3ba4990406/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1096.068700] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/67c101a9-f6ee-48ab-bb9b-cc3ba4990406/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1096.069385] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-71e06d9f-24d7-4ca6-8329-cdc621565e5f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.075752] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ca03e88-58b6-41c1-8bba-9364dc2b0f2b {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.084257] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7a221ef-688a-4405-9fb9-6f3bd8f7f1fb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.116754] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51642465-c052-4a54-84c1-d875e02730bd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.125089] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-61c70de5-f3e6-47d3-9b16-0d1bb0746d31 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.127051] env[60175]: DEBUG oslo_vmware.api [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Task: {'id': task-4292972, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066637} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1096.127051] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1096.127051] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1096.127468] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1096.127468] env[60175]: INFO nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1096.129514] env[60175]: DEBUG nova.compute.claims [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1096.129677] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1096.129877] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1096.145378] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1096.153801] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1096.154486] env[60175]: DEBUG nova.compute.utils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Instance 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a could not be found. {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1096.155850] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Instance disappeared during build. {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1096.156031] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1096.156196] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1096.156361] env[60175]: DEBUG nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1096.156519] env[60175]: DEBUG nova.network.neutron [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1096.188944] env[60175]: DEBUG oslo_vmware.rw_handles [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/67c101a9-f6ee-48ab-bb9b-cc3ba4990406/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1096.246825] env[60175]: DEBUG oslo_vmware.rw_handles [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1096.247036] env[60175]: DEBUG oslo_vmware.rw_handles [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/67c101a9-f6ee-48ab-bb9b-cc3ba4990406/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1096.288686] env[60175]: DEBUG neutronclient.v2_0.client [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60175) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1096.290295] env[60175]: ERROR nova.compute.manager [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Traceback (most recent call last): [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] result = getattr(controller, method)(*args, **kwargs) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self._get(image_id) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] resp, body = self.http_client.get(url, headers=header) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self.request(url, 'GET', **kwargs) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self._handle_response(resp) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise exc.from_response(resp, resp.content) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] During handling of the above exception, another exception occurred: [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Traceback (most recent call last): [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self.driver.spawn(context, instance, image_meta, [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self._fetch_image_if_missing(context, vi) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] image_fetch(context, vi, tmp_image_ds_loc) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] images.fetch_image( [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] metadata = IMAGE_API.get(context, image_ref) [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return session.show(context, image_id, [ 1096.290295] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] _reraise_translated_image_exception(image_id) [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise new_exc.with_traceback(exc_trace) [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 
53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] result = getattr(controller, method)(*args, **kwargs) [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self._get(image_id) [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] resp, body = self.http_client.get(url, headers=header) [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self.request(url, 'GET', **kwargs) [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self._handle_response(resp) [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise exc.from_response(resp, resp.content) [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. 
[ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] During handling of the above exception, another exception occurred: [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Traceback (most recent call last): [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self._build_and_run_instance(context, instance, image, [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] with excutils.save_and_reraise_exception(): [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self.force_reraise() [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise self.value [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] with self.rt.instance_claim(context, instance, node, allocs, [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self.abort() [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1096.291262] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return f(*args, **kwargs) [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self._unset_instance_host_and_node(instance) [ 1096.293259] env[60175]: ERROR nova.compute.manager 
[instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] instance.save() [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] updates, result = self.indirection_api.object_action( [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return cctxt.call(context, 'object_action', objinst=objinst, [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] result = self.transport._send( [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self._driver.send(target, ctxt, message, [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise result [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] nova.exception_Remote.InstanceNotFound_Remote: Instance 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a could not be found. 
[ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Traceback (most recent call last): [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return getattr(target, method)(*args, **kwargs) [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return fn(self, *args, **kwargs) [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] old_ref, inst_ref = db.instance_update_and_get_original( [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return f(*args, **kwargs) [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] with excutils.save_and_reraise_exception() as ectxt: [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self.force_reraise() [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise self.value [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return f(*args, 
**kwargs) [ 1096.293259] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return f(context, *args, **kwargs) [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise exception.InstanceNotFound(instance_id=uuid) [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] nova.exception.InstanceNotFound: Instance 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a could not be found. [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] During handling of the above exception, another exception occurred: [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Traceback (most recent call last): [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] ret = obj(*args, **kwargs) [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] exception_handler_v20(status_code, error_body) [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise client_exc(message=error_message, [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1096.294541] 
env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Neutron server returns request_ids: ['req-a7c66971-9e44-4d34-bc9b-5f055c3335d3'] [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] During handling of the above exception, another exception occurred: [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Traceback (most recent call last): [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self._deallocate_network(context, instance, requested_networks) [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self.network_api.deallocate_for_instance( [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] data = neutron.list_ports(**search_opts) [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] ret = obj(*args, **kwargs) [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self.list('ports', self.ports_path, retrieve_all, [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] ret = obj(*args, **kwargs) [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1096.294541] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] for r in self._pagination(collection, path, **params): [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] res = self.get(path, params=params) [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] ret = obj(*args, **kwargs) [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self.retry_request("GET", action, body=body, [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] ret = obj(*args, **kwargs) [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] return self.do_request(method, action, body=body, [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] ret = obj(*args, **kwargs) [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] self._handle_fault_response(status_code, replybody, resp) [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] raise exception.Unauthorized() [ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] nova.exception.Unauthorized: Not authorized. 
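The traceback above ends in nova.exception.Unauthorized because nova's Neutron client wrapper translates the 401 raised by python-neutronclient into a nova-level exception (the repeated "wrapper" frames at nova/network/neutron.py lines 196 and 204). A minimal sketch of that translation pattern, assuming python-neutronclient is installed; NovaUnauthorized and translate_neutron_errors are illustrative names, not nova's actual code:

    import functools

    from neutronclient.common import exceptions as neutron_exc


    class NovaUnauthorized(Exception):
        """Stand-in for nova.exception.Unauthorized ("Not authorized.")."""


    def translate_neutron_errors(func):
        """Re-raise neutronclient auth failures as the nova-level exception."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)      # e.g. client.list_ports(**search_opts)
            except neutron_exc.Unauthorized:
                # A 401 from Neutron (expired or invalid token) surfaces to
                # the compute manager as nova.exception.Unauthorized above.
                raise NovaUnauthorized()
        return wrapper

Because every client call goes through this wrapper, the same frame repeats in the traceback as the 401 bubbles up from list_ports through _pagination, retry_request and do_request.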
[ 1096.295696] env[60175]: ERROR nova.compute.manager [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] [ 1096.314361] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6a5e0c0d-7362-4ab2-8e30-d5bda24e2e5e tempest-ServerRescueTestJSON-247798492 tempest-ServerRescueTestJSON-247798492-project-member] Lock "53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 440.256s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1097.079279] env[60175]: DEBUG nova.compute.manager [req-cbf9d8b4-40c8-4b49-9729-b7a2f6591070 req-884943f4-6b93-48a1-ad50-afe1b58cf3cb service nova] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Received event network-vif-deleted-f87176c0-1159-456d-9ea8-4227763025cd {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1106.952043] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1106.952043] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Cleaning up deleted instances with incomplete migration {{(pid=60175) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 1107.959826] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1107.969737] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1107.969966] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1107.970130] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1107.970293] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1107.971387] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4eafa06-3b7b-4284-92a9-f2e6fbd2d6c8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1107.980329] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01ff61f1-c4ab-494c-9ca4-b8c306fad4c9 {{(pid=60175) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1107.995228] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a09f389-06f8-42dd-9787-fba799cd0a72 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.001987] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d942430-e6db-432b-b8d8-3fea2efdf699 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.034312] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180735MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1108.034455] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1108.034744] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1108.130525] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance 67cfe7ba-4590-451b-9e1a-340977b597a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1108.130736] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1108.130882] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=149GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1108.156270] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-576aaf2b-2212-426a-9d48-11e9ec06dac4 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.163382] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3f1442b-32c0-4948-90ff-7558f5b58430 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.194290] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-014bb6ba-8a63-4109-a7f2-d8305cecea0a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.201165] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e587458d-17ad-45c5-b797-86873e713972 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.213873] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1108.221830] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1108.234420] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1108.234672] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1110.225898] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] 
Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1110.226322] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1110.226322] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1110.236668] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1110.236807] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1110.949994] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1110.950273] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1111.957544] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1112.945799] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1112.949487] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1112.949487] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1113.950622] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1115.950728] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1115.951193] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1115.951193] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Cleaning up deleted instances {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 1115.976371] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] There are 9 instances to clean {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 1115.976948] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Instance has had 0 of 5 cleanup attempts {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1116.029895] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Instance has had 0 of 5 cleanup attempts {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1116.068703] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Instance has had 0 of 5 cleanup attempts {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1116.104413] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 53eeb8f0-f7c6-41bf-8d7e-2e15fc22d42a] Instance has had 0 of 5 cleanup attempts {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1116.126327] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 57a5dcae-6861-418a-a041-9cd5b7a43982] Instance has had 0 of 5 cleanup attempts {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1116.147030] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 39a2035c-bb7b-4837-b556-e8bb38ffb514] Instance has had 0 of 5 cleanup attempts {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1116.167086] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 500d78f9-ee0c-4620-9936-1a9b4f4fc09a] Instance has had 0 of 5 cleanup attempts {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1116.190047] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 068814dd-328c-48d1-b514-34eb43b0f2b1] 
Instance has had 0 of 5 cleanup attempts {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1116.210586] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 029d2099-2e55-4632-81b6-b59d6a20faab] Instance has had 0 of 5 cleanup attempts {{(pid=60175) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1117.228085] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1117.946709] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1133.157930] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1133.170012] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Getting list of instances from cluster (obj){ [ 1133.170012] env[60175]: value = "domain-c8" [ 1133.170012] env[60175]: _type = "ClusterComputeResource" [ 1133.170012] env[60175]: } {{(pid=60175) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1133.171047] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07a7c771-fae4-4fa3-99b4-7502280fe6b1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1133.183512] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Got total of 4 instances {{(pid=60175) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1133.183673] env[60175]: WARNING nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] While synchronizing instance power states, found 1 instances in the database and 4 instances on the hypervisor. 
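The WARNING just above comes from the periodic _sync_power_states task comparing instances recorded in the Nova database against VMs reported by the vCenter cluster (1 vs. 4 here). A rough sketch of that comparison, with get_db_instances, get_hypervisor_instances and sync_one as hypothetical stand-ins for the database query, the virt-driver listing and the per-instance sync; this is illustrative, not the ComputeManager source:

    import logging

    LOG = logging.getLogger(__name__)


    def sync_power_states(get_db_instances, get_hypervisor_instances, sync_one):
        """Warn on a DB/hypervisor mismatch, then sync each DB instance."""
        db_instances = get_db_instances()        # 1 row for this host in the log
        vm_uuids = get_hypervisor_instances()    # 4 VMs found on domain-c8
        if len(db_instances) != len(vm_uuids):
            LOG.warning("While synchronizing instance power states, found %d "
                        "instances in the database and %d instances on the "
                        "hypervisor.", len(db_instances), len(vm_uuids))
        for inst in db_instances:
            # Each sync runs under a per-UUID lock, matching the
            # "Acquiring lock ... query_driver_power_state_and_sync" line below.
            sync_one(inst)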
[ 1133.183808] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Triggering sync for uuid 67cfe7ba-4590-451b-9e1a-340977b597a4 {{(pid=60175) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 1133.184161] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "67cfe7ba-4590-451b-9e1a-340977b597a4" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1146.226729] env[60175]: WARNING oslo_vmware.rw_handles [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1146.226729] env[60175]: ERROR oslo_vmware.rw_handles [ 1146.227462] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/67c101a9-f6ee-48ab-bb9b-cc3ba4990406/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1146.229233] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1146.229466] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Copying Virtual Disk [datastore2] vmware_temp/67c101a9-f6ee-48ab-bb9b-cc3ba4990406/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/67c101a9-f6ee-48ab-bb9b-cc3ba4990406/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1146.229806] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cecb8b7c-d210-4290-8e5c-4b511962365a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.237508] env[60175]: DEBUG oslo_vmware.api [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Waiting for the task: (returnval){ [ 1146.237508] env[60175]: value = "task-4292973" [ 1146.237508] env[60175]: _type = "Task" [ 1146.237508] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1146.245421] env[60175]: DEBUG oslo_vmware.api [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Task: {'id': task-4292973, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1146.748058] env[60175]: DEBUG oslo_vmware.exceptions [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Fault InvalidArgument not matched. {{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1146.748312] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1146.748905] env[60175]: ERROR nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1146.748905] env[60175]: Faults: ['InvalidArgument'] [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Traceback (most recent call last): [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] yield resources [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] self.driver.spawn(context, instance, image_meta, [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] self._fetch_image_if_missing(context, vi) [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] image_cache(vi, tmp_image_ds_loc) [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] vm_util.copy_virtual_disk( [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] session._wait_for_task(vmdk_copy_task) [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] return self.wait_for_task(task_ref) [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] return evt.wait() [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] result = hub.switch() [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] return self.greenlet.switch() [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] self.f(*self.args, **self.kw) [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] raise exceptions.translate_fault(task_info.error) [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Faults: ['InvalidArgument'] [ 1146.748905] env[60175]: ERROR nova.compute.manager [instance: 
67cfe7ba-4590-451b-9e1a-340977b597a4] [ 1146.750023] env[60175]: INFO nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Terminating instance [ 1146.750799] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1146.751008] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1146.751257] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-83e35abc-2f44-4a8c-8946-5c69d52c6d60 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.753463] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1146.753657] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1146.754359] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea3be13b-aebe-4979-8219-aed750f64f31 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.761276] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1146.761480] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-15f62fab-edf6-4f42-979c-ef59f12e7245 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.763710] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1146.763812] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1146.764677] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-be8e89ee-97bf-47d9-834c-0612bed1d90d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.769300] env[60175]: DEBUG oslo_vmware.api [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Waiting for the task: (returnval){ [ 1146.769300] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]526283aa-ae94-126b-1000-e571affb8a9d" [ 1146.769300] env[60175]: _type = "Task" [ 1146.769300] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1146.776346] env[60175]: DEBUG oslo_vmware.api [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]526283aa-ae94-126b-1000-e571affb8a9d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1146.843929] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1146.844166] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1146.844343] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Deleting the datastore file [datastore2] 67cfe7ba-4590-451b-9e1a-340977b597a4 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1146.844604] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-efdbb753-10e7-47bd-b5bc-6b747f540747 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.850841] env[60175]: DEBUG oslo_vmware.api [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Waiting for the task: (returnval){ [ 1146.850841] env[60175]: value = "task-4292975" [ 1146.850841] env[60175]: _type = "Task" [ 1146.850841] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1146.859999] env[60175]: DEBUG oslo_vmware.api [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Task: {'id': task-4292975, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1147.279660] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1147.280075] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Creating directory with path [datastore2] vmware_temp/23f2e07c-7404-4ea5-94e5-3e98acbeb8fe/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1147.280141] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4422654d-6ab3-49a4-86d5-006d181c5309 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.290944] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Created directory with path [datastore2] vmware_temp/23f2e07c-7404-4ea5-94e5-3e98acbeb8fe/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1147.291137] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Fetch image to [datastore2] vmware_temp/23f2e07c-7404-4ea5-94e5-3e98acbeb8fe/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1147.291335] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/23f2e07c-7404-4ea5-94e5-3e98acbeb8fe/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1147.292029] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e510d80d-fbed-4621-a514-62fbffd284dd {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.298568] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d034982-7a55-4505-bdfb-67e0ff7d599c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.307298] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99bce9c3-573b-46af-ad44-956e518caf10 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.337331] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df7c114d-079c-41d5-b3d2-224eea0150a8 {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.343345] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ed34b404-a514-4100-b36a-a3843891a851 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.358653] env[60175]: DEBUG oslo_vmware.api [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Task: {'id': task-4292975, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063818} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1147.358923] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1147.359119] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1147.359293] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1147.359465] env[60175]: INFO nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Took 0.61 seconds to destroy the instance on the hypervisor. 
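Every vCenter operation in this log (CopyVirtualDisk_Task, UnregisterVM, DeleteDatastoreFile_Task, SearchDatastore_Task) follows the same shape: submit a task, then poll it until it reports success or a fault, which is what the "Waiting for the task ... progress is 0% ... completed successfully" lines reflect. A generic sketch of such a poll loop, assuming a poll_task_info callable that returns the task state; this is illustrative only, not oslo.vmware's actual wait_for_task implementation:

    import time


    class TaskFault(Exception):
        """Raised when the backend reports the task as failed."""


    def wait_for_task(poll_task_info, task_id, interval=0.5):
        """Poll task_id until it succeeds; raise TaskFault on error."""
        while True:
            info = poll_task_info(task_id)   # e.g. {'state': 'running', 'progress': 0}
            if info['state'] == 'success':
                return info                  # the log records duration_secs here
            if info['state'] == 'error':
                # The fault is translated at this point, which is how the
                # InvalidArgument/fileType error above reaches the driver.
                raise TaskFault(info.get('error', 'task failed'))
            time.sleep(interval)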
[ 1147.361677] env[60175]: DEBUG nova.compute.claims [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1147.361839] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1147.362076] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1147.366051] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1147.411243] env[60175]: DEBUG oslo_vmware.rw_handles [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/23f2e07c-7404-4ea5-94e5-3e98acbeb8fe/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1147.465408] env[60175]: DEBUG nova.scheduler.client.report [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Refreshing inventories for resource provider 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1147.471153] env[60175]: DEBUG oslo_vmware.rw_handles [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1147.471153] env[60175]: DEBUG oslo_vmware.rw_handles [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/23f2e07c-7404-4ea5-94e5-3e98acbeb8fe/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1147.480350] env[60175]: DEBUG nova.scheduler.client.report [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Updating ProviderTree inventory for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1147.481061] env[60175]: DEBUG nova.compute.provider_tree [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Updating inventory in ProviderTree for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1147.492599] env[60175]: DEBUG nova.scheduler.client.report [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Refreshing aggregate associations for resource provider 3984c8da-53ad-4889-8d1f-23bab60fa84e, aggregates: None {{(pid=60175) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1147.507461] env[60175]: DEBUG nova.scheduler.client.report [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Refreshing trait associations for resource provider 3984c8da-53ad-4889-8d1f-23bab60fa84e, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=60175) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1147.530419] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f50dbed-42b9-4a05-a396-c211f84e8a84 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.537448] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c55cc707-a3e7-4027-ac0e-c2785a8ae3ea {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.566789] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd4d1b3b-6b47-4ed1-b3da-cdd4854a5e26 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.573182] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ebe73b2-23b3-40df-a003-d8f9c2705359 {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.585846] env[60175]: DEBUG nova.compute.provider_tree [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1147.594969] env[60175]: DEBUG nova.scheduler.client.report [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1147.607378] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.245s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1147.607869] env[60175]: ERROR nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1147.607869] env[60175]: Faults: ['InvalidArgument'] [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Traceback (most recent call last): [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] self.driver.spawn(context, instance, image_meta, [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] self._fetch_image_if_missing(context, vi) [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] image_cache(vi, tmp_image_ds_loc) [ 1147.607869] env[60175]: 
ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] vm_util.copy_virtual_disk( [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] session._wait_for_task(vmdk_copy_task) [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] return self.wait_for_task(task_ref) [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] return evt.wait() [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] result = hub.switch() [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] return self.greenlet.switch() [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] self.f(*self.args, **self.kw) [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] raise exceptions.translate_fault(task_info.error) [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Faults: ['InvalidArgument'] [ 1147.607869] env[60175]: ERROR nova.compute.manager [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] [ 1147.608606] env[60175]: DEBUG nova.compute.utils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1147.609804] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 
tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Build of instance 67cfe7ba-4590-451b-9e1a-340977b597a4 was re-scheduled: A specified parameter was not correct: fileType [ 1147.609804] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1147.610183] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1147.610349] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1147.610545] env[60175]: DEBUG nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1147.610704] env[60175]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1147.882328] env[60175]: DEBUG nova.network.neutron [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1147.894028] env[60175]: INFO nova.compute.manager [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Took 0.28 seconds to deallocate network for instance. 
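Note: the traceback above shows the spawn failing inside the image-cache path (_fetch_image_if_missing -> _cache_sparse_image -> vm_util.copy_virtual_disk), which starts a vCenter CopyVirtualDisk_Task and then blocks in wait_for_task until the task reports the InvalidArgument fault ("A specified parameter was not correct: fileType"). The following is a minimal illustrative sketch of that wait-and-translate pattern only, assuming a get_task_info() callable with .state/.result/.error_message/.faults attributes; it is not the oslo.vmware implementation.

# Illustrative sketch, not oslo.vmware code: poll a vCenter-style task and
# translate an "error" result into an exception carrying the fault names,
# which is what surfaces in the log as Faults: ['InvalidArgument'].
import time


class TaskFaultError(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException (assumed name)."""

    def __init__(self, message, fault_list):
        super().__init__(message)
        self.fault_list = fault_list


def wait_for_task(get_task_info, poll_interval=0.5):
    """Poll get_task_info() until the task leaves queued/running."""
    while True:
        info = get_task_info()
        if info.state in ("queued", "running"):
            time.sleep(poll_interval)  # the real driver polls via a looping call
            continue
        if info.state == "success":
            return info.result
        # "error" state: raise with the localized message and fault names,
        # e.g. "A specified parameter was not correct: fileType" / InvalidArgument.
        raise TaskFaultError(info.error_message, list(info.faults))

In the log this exception propagates out of driver.spawn(), the build is re-scheduled, and the instance's resource claim and allocations are cleaned up.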
[ 1147.975823] env[60175]: INFO nova.scheduler.client.report [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Deleted allocations for instance 67cfe7ba-4590-451b-9e1a-340977b597a4 [ 1147.991859] env[60175]: DEBUG oslo_concurrency.lockutils [None req-46f39ab2-9efc-4b9a-a59d-2e6cb5c1b205 tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "67cfe7ba-4590-451b-9e1a-340977b597a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 339.147s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1147.992134] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "67cfe7ba-4590-451b-9e1a-340977b597a4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 142.320s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1147.992359] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Acquiring lock "67cfe7ba-4590-451b-9e1a-340977b597a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1147.992564] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "67cfe7ba-4590-451b-9e1a-340977b597a4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1147.992765] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "67cfe7ba-4590-451b-9e1a-340977b597a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1147.994764] env[60175]: INFO nova.compute.manager [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Terminating instance [ 1147.996435] env[60175]: DEBUG nova.compute.manager [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Start destroying the instance on the hypervisor. 
{{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1147.996630] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1147.997084] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6c40758b-9435-4845-9a42-b99c6141b69f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.009292] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8d7305d-2140-4314-bbe2-452d60a883d9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.033866] env[60175]: WARNING nova.virt.vmwareapi.vmops [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 67cfe7ba-4590-451b-9e1a-340977b597a4 could not be found. [ 1148.034089] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1148.034270] env[60175]: INFO nova.compute.manager [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1148.034506] env[60175]: DEBUG oslo.service.loopingcall [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1148.034706] env[60175]: DEBUG nova.compute.manager [-] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1148.034801] env[60175]: DEBUG nova.network.neutron [-] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1148.059998] env[60175]: DEBUG nova.network.neutron [-] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1148.067911] env[60175]: INFO nova.compute.manager [-] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] Took 0.03 seconds to deallocate network for instance. 
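Note: the lockutils lines around the terminate path above show two levels of per-instance serialization: an internal lock keyed on the instance UUID guards do_terminate_instance, and a second "<uuid>-events" lock guards the queued external events. The sketch below illustrates that locking pattern with oslo.concurrency; the helper names and callables are assumptions, not Nova's actual code.

# Illustrative sketch of the per-instance lock nesting seen in the log,
# using oslo.concurrency internal locks (context-manager form).
from oslo_concurrency import lockutils


def terminate_instance(instance_uuid, do_destroy, clear_events):
    """Hypothetical helper mirroring the log's lock ordering."""
    # Outer lock: one build/terminate/sync operation at a time per instance.
    with lockutils.lock(instance_uuid):
        # Inner "<uuid>-events" lock: clear pending external events
        # (e.g. network-vif-plugged) before destroying the instance.
        with lockutils.lock(f"{instance_uuid}-events"):
            clear_events(instance_uuid)
        do_destroy(instance_uuid)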
[ 1148.146258] env[60175]: DEBUG oslo_concurrency.lockutils [None req-6dba1448-489b-4366-bfbc-b92ab021a3dc tempest-ImagesOneServerTestJSON-50665435 tempest-ImagesOneServerTestJSON-50665435-project-member] Lock "67cfe7ba-4590-451b-9e1a-340977b597a4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.154s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.147507] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "67cfe7ba-4590-451b-9e1a-340977b597a4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 14.963s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1148.147726] env[60175]: INFO nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: 67cfe7ba-4590-451b-9e1a-340977b597a4] During sync_power_state the instance has a pending task (deleting). Skip. [ 1148.147903] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "67cfe7ba-4590-451b-9e1a-340977b597a4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1153.542498] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "d2ff993d-35d8-479c-bb3e-2c06080896d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1153.542837] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "d2ff993d-35d8-479c-bb3e-2c06080896d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1153.555171] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Starting instance... 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1153.600554] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1153.600795] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1153.602336] env[60175]: INFO nova.compute.claims [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1153.673048] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-666919bc-aaac-4646-9639-adb191207146 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.679945] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14140e60-bfed-4622-b428-ea79af6c09c1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.709102] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b759657-26c1-4fb1-8eb6-02b68b2b3637 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.716150] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4d93d13-c9dc-493c-8b9b-abace094d97a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.729788] env[60175]: DEBUG nova.compute.provider_tree [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1153.738261] env[60175]: DEBUG nova.scheduler.client.report [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1153.750948] env[60175]: DEBUG oslo_concurrency.lockutils 
[None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1153.751499] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Start building networks asynchronously for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1153.781640] env[60175]: DEBUG nova.compute.utils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Using /dev/sd instead of None {{(pid=60175) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1153.783057] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Allocating IP information in the background. {{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1153.783243] env[60175]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] allocate_for_instance() {{(pid=60175) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1153.791910] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Start building block device mappings for instance. {{(pid=60175) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1153.837118] env[60175]: DEBUG nova.policy [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'df01dd84e1384c44a40bc475ee636bbc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '358c9351af584a2b96e16a72d45da8da', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60175) authorize /opt/stack/nova/nova/policy.py:203}} [ 1153.854019] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Start spawning the instance on the hypervisor. 
{{(pid=60175) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1153.879802] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-06-20T13:46:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-06-20T13:46:42Z,direct_url=,disk_format='vmdk',id=ab7fcb5a-745a-4c08-9c04-49b187178f83,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='615f5638ac394d9090feb5ebdacc55aa',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-06-20T13:46:43Z,virtual_size=,visibility=), allow threads: False {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1153.879935] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Flavor limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1153.880062] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Image limits 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1153.880251] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Flavor pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1153.880394] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Image pref 0:0:0 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1153.880535] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60175) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1153.880772] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1153.880926] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60175) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1153.881217] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Got 1 possible topologies {{(pid=60175) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1153.881410] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1153.881641] env[60175]: DEBUG nova.virt.hardware [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60175) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1153.882543] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f918cae-b84d-45fe-9b22-c93f4eb415cc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1153.890393] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09419aa0-d1b1-4c63-b091-70b874b130db {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1154.123237] env[60175]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Successfully created port: d689946a-6509-4fe7-9f16-0cb2eaa5d478 {{(pid=60175) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1154.605981] env[60175]: DEBUG nova.compute.manager [req-3b2d31ec-fd89-498b-a7e7-9f4841749d56 req-c7f9b7d3-bbf1-41d0-bab8-679ff8aa24d9 service nova] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Received event network-vif-plugged-d689946a-6509-4fe7-9f16-0cb2eaa5d478 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1154.606315] env[60175]: DEBUG oslo_concurrency.lockutils [req-3b2d31ec-fd89-498b-a7e7-9f4841749d56 req-c7f9b7d3-bbf1-41d0-bab8-679ff8aa24d9 service nova] Acquiring lock "d2ff993d-35d8-479c-bb3e-2c06080896d0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1154.606448] env[60175]: DEBUG oslo_concurrency.lockutils [req-3b2d31ec-fd89-498b-a7e7-9f4841749d56 req-c7f9b7d3-bbf1-41d0-bab8-679ff8aa24d9 service nova] Lock "d2ff993d-35d8-479c-bb3e-2c06080896d0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1154.606594] env[60175]: DEBUG oslo_concurrency.lockutils [req-3b2d31ec-fd89-498b-a7e7-9f4841749d56 req-c7f9b7d3-bbf1-41d0-bab8-679ff8aa24d9 service nova] Lock "d2ff993d-35d8-479c-bb3e-2c06080896d0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60175) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1154.606758] env[60175]: DEBUG nova.compute.manager [req-3b2d31ec-fd89-498b-a7e7-9f4841749d56 req-c7f9b7d3-bbf1-41d0-bab8-679ff8aa24d9 service nova] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] No waiting events found dispatching network-vif-plugged-d689946a-6509-4fe7-9f16-0cb2eaa5d478 {{(pid=60175) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1154.606915] env[60175]: WARNING nova.compute.manager [req-3b2d31ec-fd89-498b-a7e7-9f4841749d56 req-c7f9b7d3-bbf1-41d0-bab8-679ff8aa24d9 service nova] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Received unexpected event network-vif-plugged-d689946a-6509-4fe7-9f16-0cb2eaa5d478 for instance with vm_state building and task_state spawning. [ 1154.679162] env[60175]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Successfully updated port: d689946a-6509-4fe7-9f16-0cb2eaa5d478 {{(pid=60175) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1154.688404] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "refresh_cache-d2ff993d-35d8-479c-bb3e-2c06080896d0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1154.688540] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquired lock "refresh_cache-d2ff993d-35d8-479c-bb3e-2c06080896d0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1154.688690] env[60175]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Building network info cache for instance {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1154.723095] env[60175]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Instance cache missing network info. 
{{(pid=60175) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1154.873292] env[60175]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Updating instance_info_cache with network_info: [{"id": "d689946a-6509-4fe7-9f16-0cb2eaa5d478", "address": "fa:16:3e:1d:11:f8", "network": {"id": "83e08a1b-17a3-451c-94d0-4578b74abb48", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1415169631-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "358c9351af584a2b96e16a72d45da8da", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60567ee6-01d0-4b16-9c7a-4a896827d6eb", "external-id": "nsx-vlan-transportzone-28", "segmentation_id": 28, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd689946a-65", "ovs_interfaceid": "d689946a-6509-4fe7-9f16-0cb2eaa5d478", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1154.884200] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Releasing lock "refresh_cache-d2ff993d-35d8-479c-bb3e-2c06080896d0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1154.884483] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Instance network_info: |[{"id": "d689946a-6509-4fe7-9f16-0cb2eaa5d478", "address": "fa:16:3e:1d:11:f8", "network": {"id": "83e08a1b-17a3-451c-94d0-4578b74abb48", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1415169631-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "358c9351af584a2b96e16a72d45da8da", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60567ee6-01d0-4b16-9c7a-4a896827d6eb", "external-id": "nsx-vlan-transportzone-28", "segmentation_id": 28, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd689946a-65", "ovs_interfaceid": "d689946a-6509-4fe7-9f16-0cb2eaa5d478", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=60175) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1154.884877] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1d:11:f8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '60567ee6-01d0-4b16-9c7a-4a896827d6eb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd689946a-6509-4fe7-9f16-0cb2eaa5d478', 'vif_model': 'vmxnet3'}] {{(pid=60175) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1154.892298] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Creating folder: Project (358c9351af584a2b96e16a72d45da8da). Parent ref: group-v845475. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1154.892769] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d021f4fc-3510-4f3e-8249-fc84c1b27130 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1154.904271] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Created folder: Project (358c9351af584a2b96e16a72d45da8da) in parent group-v845475. [ 1154.904454] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Creating folder: Instances. Parent ref: group-v845544. {{(pid=60175) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1154.904671] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-792d7df4-981b-4257-9e7a-bc9464602a7e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1154.912477] env[60175]: INFO nova.virt.vmwareapi.vm_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Created folder: Instances in parent group-v845544. [ 1154.912689] env[60175]: DEBUG oslo.service.loopingcall [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60175) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1154.912856] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Creating VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1154.913073] env[60175]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-68a1cdf2-5833-4ea7-bf52-12d9cec1298a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1154.931222] env[60175]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1154.931222] env[60175]: value = "task-4292978" [ 1154.931222] env[60175]: _type = "Task" [ 1154.931222] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1154.938452] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292978, 'name': CreateVM_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1155.442383] env[60175]: DEBUG oslo_vmware.api [-] Task: {'id': task-4292978, 'name': CreateVM_Task, 'duration_secs': 0.275653} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1155.442551] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Created VM on the ESX host {{(pid=60175) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1155.443201] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1155.443367] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1155.443703] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1155.443937] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-54c9d728-ef20-4093-848c-e90b12653318 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.448009] env[60175]: DEBUG oslo_vmware.api [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Waiting for the task: (returnval){ [ 1155.448009] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52efc212-1fc7-d2b4-7809-13101537c4bb" [ 1155.448009] env[60175]: _type = "Task" [ 1155.448009] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1155.454983] env[60175]: DEBUG oslo_vmware.api [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52efc212-1fc7-d2b4-7809-13101537c4bb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1155.958248] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1155.958608] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Processing image ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1155.958831] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1156.630176] env[60175]: DEBUG nova.compute.manager [req-7202376b-7e69-4776-8748-3fbaed10bed1 req-a0edf66c-bffc-4d2d-9200-f797d00aa9a4 service nova] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Received event network-changed-d689946a-6509-4fe7-9f16-0cb2eaa5d478 {{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1156.630389] env[60175]: DEBUG nova.compute.manager [req-7202376b-7e69-4776-8748-3fbaed10bed1 req-a0edf66c-bffc-4d2d-9200-f797d00aa9a4 service nova] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Refreshing instance network info cache due to event network-changed-d689946a-6509-4fe7-9f16-0cb2eaa5d478. 
{{(pid=60175) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1156.630603] env[60175]: DEBUG oslo_concurrency.lockutils [req-7202376b-7e69-4776-8748-3fbaed10bed1 req-a0edf66c-bffc-4d2d-9200-f797d00aa9a4 service nova] Acquiring lock "refresh_cache-d2ff993d-35d8-479c-bb3e-2c06080896d0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1156.630745] env[60175]: DEBUG oslo_concurrency.lockutils [req-7202376b-7e69-4776-8748-3fbaed10bed1 req-a0edf66c-bffc-4d2d-9200-f797d00aa9a4 service nova] Acquired lock "refresh_cache-d2ff993d-35d8-479c-bb3e-2c06080896d0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1156.630902] env[60175]: DEBUG nova.network.neutron [req-7202376b-7e69-4776-8748-3fbaed10bed1 req-a0edf66c-bffc-4d2d-9200-f797d00aa9a4 service nova] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Refreshing network info cache for port d689946a-6509-4fe7-9f16-0cb2eaa5d478 {{(pid=60175) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1156.880242] env[60175]: DEBUG nova.network.neutron [req-7202376b-7e69-4776-8748-3fbaed10bed1 req-a0edf66c-bffc-4d2d-9200-f797d00aa9a4 service nova] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Updated VIF entry in instance network info cache for port d689946a-6509-4fe7-9f16-0cb2eaa5d478. {{(pid=60175) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1156.880628] env[60175]: DEBUG nova.network.neutron [req-7202376b-7e69-4776-8748-3fbaed10bed1 req-a0edf66c-bffc-4d2d-9200-f797d00aa9a4 service nova] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Updating instance_info_cache with network_info: [{"id": "d689946a-6509-4fe7-9f16-0cb2eaa5d478", "address": "fa:16:3e:1d:11:f8", "network": {"id": "83e08a1b-17a3-451c-94d0-4578b74abb48", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1415169631-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "358c9351af584a2b96e16a72d45da8da", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "60567ee6-01d0-4b16-9c7a-4a896827d6eb", "external-id": "nsx-vlan-transportzone-28", "segmentation_id": 28, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd689946a-65", "ovs_interfaceid": "d689946a-6509-4fe7-9f16-0cb2eaa5d478", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1156.889666] env[60175]: DEBUG oslo_concurrency.lockutils [req-7202376b-7e69-4776-8748-3fbaed10bed1 req-a0edf66c-bffc-4d2d-9200-f797d00aa9a4 service nova] Releasing lock "refresh_cache-d2ff993d-35d8-479c-bb3e-2c06080896d0" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1169.951045] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1169.951045] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1169.951045] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1169.961160] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1169.961334] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1169.961567] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1169.972122] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1169.972328] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1169.972494] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1169.972675] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1169.973710] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2361180a-741b-4666-a2f1-1931501a3f91 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1169.981713] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87e8e558-83f2-4253-889a-035b6eebbf02 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1169.995874] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a78d52f-b205-411e-a5c2-9c6e1fa11a80 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.001960] 
env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cefae24b-724c-43f0-a88a-5242412c8bb1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.032344] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180728MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1170.032572] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1170.032865] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1170.068708] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance d2ff993d-35d8-479c-bb3e-2c06080896d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1170.068896] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1170.069050] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=149GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1170.092534] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5142c7c-51e9-4292-b7fa-de618ec66fb0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.099218] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7e986b8-e61e-439b-8956-4fd7b0b77e57 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.127916] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-410d4513-e136-4974-bc82-7c4d8bd11ae0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.134355] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72bd26bb-8099-4882-9f31-3792487402fe {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.146893] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not 
changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1170.154439] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1170.166746] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1170.166920] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1171.156074] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1171.950915] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1174.945354] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1174.950056] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1174.950056] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1175.951763] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1175.952138] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1176.950641] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1196.732054] env[60175]: WARNING oslo_vmware.rw_handles [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1196.732054] env[60175]: ERROR oslo_vmware.rw_handles [ 1196.732054] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/23f2e07c-7404-4ea5-94e5-3e98acbeb8fe/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1196.734107] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1196.734542] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/23f2e07c-7404-4ea5-94e5-3e98acbeb8fe/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] vmware_temp/23f2e07c-7404-4ea5-94e5-3e98acbeb8fe/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1196.734981] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-58ea3f43-8672-4bbc-b29a-b0a86b2d7ba8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.743095] env[60175]: DEBUG oslo_vmware.api [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Waiting for the task: (returnval){ [ 1196.743095] env[60175]: value = "task-4292979" [ 1196.743095] env[60175]: _type = "Task" [ 1196.743095] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1196.751101] env[60175]: DEBUG oslo_vmware.api [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Task: {'id': task-4292979, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1197.253067] env[60175]: DEBUG oslo_vmware.exceptions [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Fault InvalidArgument not matched. {{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1197.253308] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1197.253831] env[60175]: ERROR nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1197.253831] env[60175]: Faults: ['InvalidArgument'] [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Traceback (most recent call last): [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] yield resources [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] self.driver.spawn(context, instance, image_meta, [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1197.253831] env[60175]: 
ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] self._fetch_image_if_missing(context, vi) [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] image_cache(vi, tmp_image_ds_loc) [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] vm_util.copy_virtual_disk( [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] session._wait_for_task(vmdk_copy_task) [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] return self.wait_for_task(task_ref) [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] return evt.wait() [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] result = hub.switch() [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] return self.greenlet.switch() [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] self.f(*self.args, **self.kw) [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] raise exceptions.translate_fault(task_info.error) [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] oslo_vmware.exceptions.VimFaultException: A 
specified parameter was not correct: fileType [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Faults: ['InvalidArgument'] [ 1197.253831] env[60175]: ERROR nova.compute.manager [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] [ 1197.254991] env[60175]: INFO nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Terminating instance [ 1197.255661] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1197.255860] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1197.256101] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b338e534-3912-4baa-be0f-4f082c855fa1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.258222] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Start destroying the instance on the hypervisor. 
{{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1197.258415] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1197.259137] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2a009ac-266e-4278-a037-a6835f4cd1fa {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.266262] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1197.266262] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-188cdb12-fb18-47e4-b940-249c8818f963 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.268012] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1197.268251] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1197.269188] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3a5c81ea-79d9-4c42-a95b-3b78d479d460 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.273937] env[60175]: DEBUG oslo_vmware.api [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Waiting for the task: (returnval){ [ 1197.273937] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]527acc41-226f-51eb-feb3-892c5b450511" [ 1197.273937] env[60175]: _type = "Task" [ 1197.273937] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1197.288125] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1197.288347] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Creating directory with path [datastore2] vmware_temp/f69b30b8-3f11-4add-b321-ff31949c7444/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1197.288548] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c3030a87-72c0-4f9c-9157-c494d0cfd96d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.300079] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Created directory with path [datastore2] vmware_temp/f69b30b8-3f11-4add-b321-ff31949c7444/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1197.300266] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Fetch image to [datastore2] vmware_temp/f69b30b8-3f11-4add-b321-ff31949c7444/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1197.300431] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/f69b30b8-3f11-4add-b321-ff31949c7444/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1197.301167] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06c08890-4ea1-4ffb-a0ff-4b042a0b89ba {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.307480] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d39fb98f-9041-40e6-b480-02f5f877cc69 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.316467] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6a97f76-0bc3-4c73-87f8-46c5ab0ff50d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.345976] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e127e2a-c32d-408f-b3ae-ff6eedae8d42 {{(pid=60175) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.351805] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9e2b0641-c2d8-4ef0-bab7-88f91a3791a9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.368533] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1197.368754] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1197.368930] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Deleting the datastore file [datastore2] e2c5328d-ba5a-4348-8a3f-2a9f745e8f08 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1197.369199] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-26bf7548-413f-450d-9797-6d673dc58888 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.372657] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1197.379706] env[60175]: DEBUG oslo_vmware.api [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Waiting for the task: (returnval){ [ 1197.379706] env[60175]: value = "task-4292981" [ 1197.379706] env[60175]: _type = "Task" [ 1197.379706] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1197.387409] env[60175]: DEBUG oslo_vmware.api [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Task: {'id': task-4292981, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1197.502894] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1197.503704] env[60175]: ERROR nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Traceback (most recent call last): [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] result = getattr(controller, method)(*args, **kwargs) [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self._get(image_id) [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] resp, body = self.http_client.get(url, headers=header) [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self.request(url, 'GET', **kwargs) [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self._handle_response(resp) [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise exc.from_response(resp, resp.content) [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] During handling of the above exception, another exception occurred: [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Traceback (most recent call last): [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] yield resources [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self.driver.spawn(context, instance, image_meta, [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self._fetch_image_if_missing(context, vi) [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] image_fetch(context, vi, tmp_image_ds_loc) [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] images.fetch_image( [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1197.503704] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] metadata = IMAGE_API.get(context, image_ref) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File 
"/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return session.show(context, image_id, [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] _reraise_translated_image_exception(image_id) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise new_exc.with_traceback(exc_trace) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] result = getattr(controller, method)(*args, **kwargs) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self._get(image_id) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] resp, body = self.http_client.get(url, headers=header) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self.request(url, 'GET', **kwargs) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self._handle_response(resp) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise 
exc.from_response(resp, resp.content) [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1197.504944] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1197.504944] env[60175]: INFO nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Terminating instance [ 1197.505661] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1197.505801] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1197.506440] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1197.506624] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1197.506842] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-25251bdb-8c81-48b3-a5a7-5bc68ec9d0d6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.509529] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0201db4-5827-40b1-a508-a0ec6aaf3a46 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.516391] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1197.516623] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5abcf060-30e3-44c4-9c2e-a9f0e4364a1a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.519305] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1197.519473] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1197.520126] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ce1a3ddf-da01-463c-8aae-4e1ab23d3d1f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.524818] env[60175]: DEBUG oslo_vmware.api [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Waiting for the task: (returnval){ [ 1197.524818] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52164bc9-d956-fd1d-2ccd-cbc9bfab778b" [ 1197.524818] env[60175]: _type = "Task" [ 1197.524818] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1197.531566] env[60175]: DEBUG oslo_vmware.api [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]52164bc9-d956-fd1d-2ccd-cbc9bfab778b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1197.573978] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1197.574202] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1197.574376] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Deleting the datastore file [datastore2] a45c150e-942b-454a-ab59-aa6b191bfada {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1197.574615] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0ac1763b-3901-4bd5-a2d2-210bcfa09b58 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.580757] env[60175]: DEBUG oslo_vmware.api [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Waiting for the task: (returnval){ [ 1197.580757] env[60175]: value = "task-4292983" [ 1197.580757] env[60175]: _type = "Task" [ 1197.580757] env[60175]: } to complete. 
{{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1197.588112] env[60175]: DEBUG oslo_vmware.api [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Task: {'id': task-4292983, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1197.889353] env[60175]: DEBUG oslo_vmware.api [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Task: {'id': task-4292981, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.09598} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1197.889796] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1197.889796] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1197.889881] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1197.890031] env[60175]: INFO nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Took 0.63 seconds to destroy the instance on the hypervisor. 
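The Task/wait_for_task entries above (task-4292981, DeleteDatastoreFile_Task: "progress is 0%" followed by "completed successfully") reflect a simple poll-until-terminal loop. The following is a minimal, self-contained sketch of that pattern, assuming a stand-in FakeTaskInfo object; the names are illustrative and are not the real oslo.vmware API surface, whose actual implementation lives in oslo_vmware/api.py.

import itertools
import time


class FakeTaskInfo:
    """Stands in for the vCenter TaskInfo object that the driver polls (hypothetical)."""

    def __init__(self, states):
        self._states = iter(states)

    def poll(self):
        # Each call returns (state, progress) for the next poll cycle.
        return next(self._states)


def wait_for_task(task_info, interval=0.1):
    """Poll until the task reaches a terminal state, logging progress like the entries above."""
    for attempt in itertools.count(1):
        state, progress = task_info.poll()
        print(f"progress is {progress}%. (poll #{attempt})")
        if state == "success":
            print("completed successfully.")
            return
        if state == "error":
            raise RuntimeError("task reported an error")
        time.sleep(interval)


if __name__ == "__main__":
    # Mirrors the DeleteDatastoreFile_Task sequence: one 0% poll, then success.
    wait_for_task(FakeTaskInfo([("running", 0), ("success", 100)]))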
[ 1197.892095] env[60175]: DEBUG nova.compute.claims [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1197.892266] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1197.892471] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1197.916605] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1197.917311] env[60175]: DEBUG nova.compute.utils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Instance e2c5328d-ba5a-4348-8a3f-2a9f745e8f08 could not be found. {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1197.918703] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Instance disappeared during build. {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1197.918869] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1197.919043] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1197.919204] env[60175]: DEBUG nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1197.919358] env[60175]: DEBUG nova.network.neutron [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1197.941978] env[60175]: DEBUG nova.network.neutron [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1197.949687] env[60175]: INFO nova.compute.manager [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Took 0.03 seconds to deallocate network for instance. [ 1197.988371] env[60175]: DEBUG oslo_concurrency.lockutils [None req-5b5abb27-a242-4d36-8ab7-d2b606972bbf tempest-ServersTestMultiNic-406465050 tempest-ServersTestMultiNic-406465050-project-member] Lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 324.741s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1197.988574] env[60175]: DEBUG oslo_concurrency.lockutils [req-2e2467cc-e098-4278-9de4-50721b01ca81 req-9a758f04-ff84-4f1e-9181-b1b6a8b07a7d service nova] Acquired lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1197.989410] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2d83e40-9ba1-4a3e-a0f0-83d7607b498d {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.997365] env[60175]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. 
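A SOAP fault carried inside an HTTP 200 reply, as warned about above, ends up being mapped to an exception class by fault name (compare the earlier "Fault InvalidArgument not matched" debug entry and the "Fault list: [ManagedObjectNotFound]" entry that follows). A minimal sketch of that lookup-with-fallback pattern is below; it is illustrative only, with locally defined exception classes, and is not the real oslo_vmware.exceptions code.

class VimFaultException(Exception):
    """Generic fault wrapper used when no specific class matches (illustrative)."""

    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


class ManagedObjectNotFoundException(VimFaultException):
    pass


class InvalidArgumentException(VimFaultException):
    pass


# Registry of fault names reported by the server to specific exception types.
_FAULT_CLASSES = {
    "ManagedObjectNotFound": ManagedObjectNotFoundException,
    "InvalidArgument": InvalidArgumentException,
}


def translate_fault(fault_name, message):
    """Return the most specific exception for a reported fault name."""
    cls = _FAULT_CLASSES.get(fault_name)
    if cls is None:
        # Corresponds to a "Fault ... not matched" case: keep the generic type.
        return VimFaultException([fault_name], message)
    return cls([fault_name], message)


if __name__ == "__main__":
    exc = translate_fault(
        "ManagedObjectNotFound",
        "The object 'vim.VirtualMachine:vm-845539' has already been deleted "
        "or has not been completely created",
    )
    print(type(exc).__name__, exc.fault_list)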
[ 1197.997509] env[60175]: DEBUG oslo_vmware.api [-] Fault list: [ManagedObjectNotFound] {{(pid=60175) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 1197.997808] env[60175]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4e28918a-4e7a-4303-ac20-60a0c68ba0dc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.005103] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10ac9e2f-7d90-4f4e-b035-fb33e99773eb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.027933] env[60175]: ERROR root [req-2e2467cc-e098-4278-9de4-50721b01ca81 req-9a758f04-ff84-4f1e-9181-b1b6a8b07a7d service nova] Original exception being dropped: ['Traceback (most recent call last):\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py", line 377, in request_handler\n response = request(managed_object, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 586, in __call__\n return client.invoke(args, kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 728, in invoke\n result = self.send(soapenv, timeout=timeout)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 777, in send\n return self.process_reply(reply.message, None, None)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 840, in process_reply\n raise WebFault(fault, replyroot)\n', "suds.WebFault: Server raised fault: 'The object 'vim.VirtualMachine:vm-845539' has already been deleted or has not been completely created'\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 301, in _invoke_api\n return api_method(*args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/vim_util.py", line 480, in get_object_property\n props = get_object_properties(vim, moref, [property_name],\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/vim_util.py", line 360, in get_object_properties\n retrieve_result = vim.RetrievePropertiesEx(\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py", line 413, in request_handler\n raise exceptions.VimFaultException(fault_list, fault_string,\n', "oslo_vmware.exceptions.VimFaultException: The object 'vim.VirtualMachine:vm-845539' has already been deleted or has not been completely created\nCause: Server raised fault: 'The object 'vim.VirtualMachine:vm-845539' has already been deleted or has not been completely created'\nFaults: [ManagedObjectNotFound]\nDetails: {'obj': 'vm-845539'}\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 123, in _call_method\n return self.invoke_api(module, method, self.vim, *args,\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 358, in invoke_api\n return _invoke_api(module, method, *args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 122, in func\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, 
in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 122, in _inner\n idle = self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 96, in _func\n result = f(*args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 341, in _invoke_api\n raise clazz(str(excep),\n', "oslo_vmware.exceptions.ManagedObjectNotFoundException: The object 'vim.VirtualMachine:vm-845539' has already been deleted or has not been completely created\nCause: Server raised fault: 'The object 'vim.VirtualMachine:vm-845539' has already been deleted or has not been completely created'\nFaults: [ManagedObjectNotFound]\nDetails: {'obj': 'vm-845539'}\n"]: nova.exception.InstanceNotFound: Instance e2c5328d-ba5a-4348-8a3f-2a9f745e8f08 could not be found. [ 1198.028133] env[60175]: DEBUG oslo_concurrency.lockutils [req-2e2467cc-e098-4278-9de4-50721b01ca81 req-9a758f04-ff84-4f1e-9181-b1b6a8b07a7d service nova] Releasing lock "e2c5328d-ba5a-4348-8a3f-2a9f745e8f08" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1198.028325] env[60175]: DEBUG nova.compute.manager [req-2e2467cc-e098-4278-9de4-50721b01ca81 req-9a758f04-ff84-4f1e-9181-b1b6a8b07a7d service nova] [instance: e2c5328d-ba5a-4348-8a3f-2a9f745e8f08] Detach interface failed, port_id=dd85cf77-b1fd-4be9-8b53-49cd7b671dfd, reason: Instance e2c5328d-ba5a-4348-8a3f-2a9f745e8f08 could not be found. {{(pid=60175) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10838}} [ 1198.035519] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1198.035744] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Creating directory with path [datastore2] vmware_temp/03c8e66e-a3b4-49e6-88b9-d17f1073c2e6/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1198.035939] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d0ad210d-59d3-4419-81e4-49a6c8f77464 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.047374] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Created directory with path [datastore2] vmware_temp/03c8e66e-a3b4-49e6-88b9-d17f1073c2e6/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1198.047553] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Fetch image to [datastore2] vmware_temp/03c8e66e-a3b4-49e6-88b9-d17f1073c2e6/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1198.047714] 
env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/03c8e66e-a3b4-49e6-88b9-d17f1073c2e6/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1198.048390] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cee8827-5330-4f68-bf68-72de183b23c8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.054307] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49925fbd-a7b3-477f-a0a5-93213b50244e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.062923] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68dfe45c-1ee0-4290-ba75-a2c3561bab8f {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.097371] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-813c0d7b-e68e-4232-9a46-37f2a8ef332a {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.104312] env[60175]: DEBUG oslo_vmware.api [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Task: {'id': task-4292983, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075278} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1198.105772] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1198.105962] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1198.106158] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1198.106343] env[60175]: INFO nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Took 0.60 seconds to destroy the instance on the hypervisor. 
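The surrounding "Acquiring lock ... / acquired ... waited Xs / released ... held Ys" entries come from lock bookkeeping around resource-tracker updates. Below is a hypothetical, self-contained decorator that reproduces that logging shape; the real mechanism is oslo_concurrency.lockutils (synchronized/lock), and the function name here is only a placeholder.

import functools
import threading
import time

_LOCKS = {}


def synchronized(name):
    """Serialize callers on a named lock and log waited/held durations (illustrative)."""
    lock = _LOCKS.setdefault(name, threading.Lock())

    def decorator(func):
        @functools.wraps(func)
        def inner(*args, **kwargs):
            target = f"{func.__module__}.{func.__qualname__}"
            print(f'Acquiring lock "{name}" by "{target}"')
            t0 = time.monotonic()
            with lock:
                waited = time.monotonic() - t0
                print(f'Lock "{name}" acquired by "{target}" :: waited {waited:.3f}s')
                t1 = time.monotonic()
                try:
                    return func(*args, **kwargs)
                finally:
                    held = time.monotonic() - t1
                    print(f'Lock "{name}" "released" by "{target}" :: held {held:.3f}s')
        return inner
    return decorator


@synchronized("compute_resources")
def abort_instance_claim(instance_uuid):
    # Placeholder for the work done while the lock is held.
    time.sleep(0.01)


if __name__ == "__main__":
    abort_instance_claim("a45c150e-942b-454a-ab59-aa6b191bfada")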
[ 1198.108153] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3bd843a2-ca7d-4bfe-99c3-64374a4ede0e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.110093] env[60175]: DEBUG nova.compute.claims [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1198.110274] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1198.110479] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1198.130470] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1198.134270] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1198.134885] env[60175]: DEBUG nova.compute.utils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Instance a45c150e-942b-454a-ab59-aa6b191bfada could not be found. {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1198.136328] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Instance disappeared during build. 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1198.136495] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1198.136655] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1198.136811] env[60175]: DEBUG nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1198.136961] env[60175]: DEBUG nova.network.neutron [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1198.163458] env[60175]: DEBUG neutronclient.v2_0.client [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60175) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1198.165035] env[60175]: ERROR nova.compute.manager [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Traceback (most recent call last): [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] result = getattr(controller, method)(*args, **kwargs) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self._get(image_id) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] resp, body = self.http_client.get(url, headers=header) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self.request(url, 'GET', **kwargs) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self._handle_response(resp) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise exc.from_response(resp, resp.content) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] During handling of the above exception, another exception occurred: [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Traceback (most recent call last): [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self.driver.spawn(context, instance, image_meta, [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self._fetch_image_if_missing(context, vi) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] image_fetch(context, vi, tmp_image_ds_loc) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] images.fetch_image( [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] metadata = IMAGE_API.get(context, image_ref) [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return session.show(context, image_id, [ 1198.165035] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] _reraise_translated_image_exception(image_id) [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise new_exc.with_traceback(exc_trace) [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: 
a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] result = getattr(controller, method)(*args, **kwargs) [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self._get(image_id) [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] resp, body = self.http_client.get(url, headers=header) [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self.request(url, 'GET', **kwargs) [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self._handle_response(resp) [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise exc.from_response(resp, resp.content) [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. 
[ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] During handling of the above exception, another exception occurred: [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Traceback (most recent call last): [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self._build_and_run_instance(context, instance, image, [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] with excutils.save_and_reraise_exception(): [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self.force_reraise() [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise self.value [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] with self.rt.instance_claim(context, instance, node, allocs, [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self.abort() [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1198.166228] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return f(*args, **kwargs) [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self._unset_instance_host_and_node(instance) [ 1198.167380] env[60175]: ERROR nova.compute.manager 
[instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] instance.save() [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] updates, result = self.indirection_api.object_action( [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return cctxt.call(context, 'object_action', objinst=objinst, [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] result = self.transport._send( [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self._driver.send(target, ctxt, message, [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise result [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] nova.exception_Remote.InstanceNotFound_Remote: Instance a45c150e-942b-454a-ab59-aa6b191bfada could not be found. 
[ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Traceback (most recent call last): [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return getattr(target, method)(*args, **kwargs) [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return fn(self, *args, **kwargs) [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] old_ref, inst_ref = db.instance_update_and_get_original( [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return f(*args, **kwargs) [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] with excutils.save_and_reraise_exception() as ectxt: [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self.force_reraise() [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise self.value [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return f(*args, 
**kwargs) [ 1198.167380] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return f(context, *args, **kwargs) [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise exception.InstanceNotFound(instance_id=uuid) [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] nova.exception.InstanceNotFound: Instance a45c150e-942b-454a-ab59-aa6b191bfada could not be found. [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] During handling of the above exception, another exception occurred: [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Traceback (most recent call last): [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] ret = obj(*args, **kwargs) [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] exception_handler_v20(status_code, error_body) [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise client_exc(message=error_message, [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1198.168741] 
env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Neutron server returns request_ids: ['req-0c995065-7d88-4075-923c-f26343e0a1a4'] [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] During handling of the above exception, another exception occurred: [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] Traceback (most recent call last): [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self._deallocate_network(context, instance, requested_networks) [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self.network_api.deallocate_for_instance( [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] data = neutron.list_ports(**search_opts) [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] ret = obj(*args, **kwargs) [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self.list('ports', self.ports_path, retrieve_all, [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] ret = obj(*args, **kwargs) [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1198.168741] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] for r in self._pagination(collection, path, **params): [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] res = self.get(path, params=params) [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] ret = obj(*args, **kwargs) [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self.retry_request("GET", action, body=body, [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] ret = obj(*args, **kwargs) [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] return self.do_request(method, action, body=body, [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] ret = obj(*args, **kwargs) [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] self._handle_fault_response(status_code, replybody, resp) [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] raise exception.Unauthorized() [ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] nova.exception.Unauthorized: Not authorized. 
[ 1198.169916] env[60175]: ERROR nova.compute.manager [instance: a45c150e-942b-454a-ab59-aa6b191bfada] [ 1198.186172] env[60175]: DEBUG oslo_concurrency.lockutils [None req-901b08dc-1d34-49d1-8c17-6cd84999a6c2 tempest-ServerMetadataTestJSON-266004654 tempest-ServerMetadataTestJSON-266004654-project-member] Lock "a45c150e-942b-454a-ab59-aa6b191bfada" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 319.770s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1198.226009] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1198.226769] env[60175]: ERROR nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Traceback (most recent call last): [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] result = getattr(controller, method)(*args, **kwargs) [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self._get(image_id) [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] resp, body = self.http_client.get(url, headers=header) [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self.request(url, 'GET', **kwargs) [ 1198.226769] 
env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self._handle_response(resp) [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise exc.from_response(resp, resp.content) [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] During handling of the above exception, another exception occurred: [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Traceback (most recent call last): [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] yield resources [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self.driver.spawn(context, instance, image_meta, [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self._fetch_image_if_missing(context, vi) [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] image_fetch(context, vi, tmp_image_ds_loc) [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] images.fetch_image( [ 
1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1198.226769] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] metadata = IMAGE_API.get(context, image_ref) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return session.show(context, image_id, [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] _reraise_translated_image_exception(image_id) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise new_exc.with_traceback(exc_trace) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] result = getattr(controller, method)(*args, **kwargs) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self._get(image_id) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] resp, body = self.http_client.get(url, headers=header) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self.request(url, 'GET', **kwargs) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1198.227922] env[60175]: ERROR 
nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self._handle_response(resp) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise exc.from_response(resp, resp.content) [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. [ 1198.227922] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1198.227922] env[60175]: INFO nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Terminating instance [ 1198.228650] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1198.228792] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1198.229460] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Start destroying the instance on the hypervisor. 
{{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1198.229646] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1198.229864] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-96343898-f969-477b-ad7b-1aeda0b95c41 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.232850] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-398dc9ee-6c43-43a9-b41d-5306bfb19950 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.239657] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1198.239886] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f6f89ef5-d92c-4b25-aae1-76d6cd3027ef {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.242311] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1198.242480] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60175) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1198.243442] env[60175]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-88eeed8e-dc98-49d1-9a1d-5969232dbc5e {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.248582] env[60175]: DEBUG oslo_vmware.api [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Waiting for the task: (returnval){ [ 1198.248582] env[60175]: value = "session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5255f6ba-fbd2-1344-058c-30f5f431303a" [ 1198.248582] env[60175]: _type = "Task" [ 1198.248582] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1198.255819] env[60175]: DEBUG oslo_vmware.api [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Task: {'id': session[52f8fad9-5bda-a83e-4a4f-0fab9bf5e26f]5255f6ba-fbd2-1344-058c-30f5f431303a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1198.299685] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1198.299892] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1198.300088] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Deleting the datastore file [datastore2] 8f4635d8-5789-4402-8ca2-543b4d4dfc76 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1198.300410] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a8585df7-0aab-44e0-b2e8-e358c3cffdd7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.306575] env[60175]: DEBUG oslo_vmware.api [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Waiting for the task: (returnval){ [ 1198.306575] env[60175]: value = "task-4292985" [ 1198.306575] env[60175]: _type = "Task" [ 1198.306575] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1198.313947] env[60175]: DEBUG oslo_vmware.api [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Task: {'id': task-4292985, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1198.758902] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Preparing fetch location {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1198.759182] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Creating directory with path [datastore2] vmware_temp/d637dc45-edda-40b0-806b-54db38df4a6c/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1198.759403] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dd8f5456-80c8-4435-9dfb-76127b8164a7 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.770512] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Created directory with path [datastore2] vmware_temp/d637dc45-edda-40b0-806b-54db38df4a6c/ab7fcb5a-745a-4c08-9c04-49b187178f83 {{(pid=60175) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1198.770690] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Fetch image to [datastore2] vmware_temp/d637dc45-edda-40b0-806b-54db38df4a6c/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1198.770852] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to [datastore2] vmware_temp/d637dc45-edda-40b0-806b-54db38df4a6c/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1198.771552] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f731cf28-8369-4415-abb0-2386d08078ff {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.779131] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5802e19e-480c-492e-b4d6-c4c13fe46fbf {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.787600] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67708a2e-bdb8-413c-af60-d1fa8bcc8cf3 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.819438] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f7e7294c-ec8b-46b3-9872-527da9fa8def {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.825811] env[60175]: DEBUG oslo_vmware.api [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Task: {'id': task-4292985, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.085951} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1198.827201] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1198.827385] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1198.827549] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1198.827717] env[60175]: INFO nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1198.829435] env[60175]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bbea0b0a-528e-4ea1-9b96-6fda175dea1c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1198.831233] env[60175]: DEBUG nova.compute.claims [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1198.831400] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1198.831605] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1198.855042] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Downloading image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1198.859023] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1198.859645] env[60175]: DEBUG nova.compute.utils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Instance 8f4635d8-5789-4402-8ca2-543b4d4dfc76 could not be found. {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1198.860977] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Instance disappeared during build. 
{{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1198.861158] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1198.861320] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1198.861465] env[60175]: DEBUG nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1198.861620] env[60175]: DEBUG nova.network.neutron [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1198.899417] env[60175]: DEBUG oslo_vmware.rw_handles [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d637dc45-edda-40b0-806b-54db38df4a6c/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60175) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1198.958305] env[60175]: DEBUG oslo_vmware.rw_handles [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Completed reading data from the image iterator. {{(pid=60175) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1198.958506] env[60175]: DEBUG oslo_vmware.rw_handles [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d637dc45-edda-40b0-806b-54db38df4a6c/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60175) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1198.999285] env[60175]: DEBUG neutronclient.v2_0.client [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60175) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1199.000819] env[60175]: ERROR nova.compute.manager [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Traceback (most recent call last): [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] result = getattr(controller, method)(*args, **kwargs) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self._get(image_id) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] resp, body = self.http_client.get(url, headers=header) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self.request(url, 'GET', **kwargs) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self._handle_response(resp) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in 
_handle_response [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise exc.from_response(resp, resp.content) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] During handling of the above exception, another exception occurred: [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Traceback (most recent call last): [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self.driver.spawn(context, instance, image_meta, [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self._fetch_image_if_missing(context, vi) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] image_fetch(context, vi, tmp_image_ds_loc) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] images.fetch_image( [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] metadata = IMAGE_API.get(context, image_ref) [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return session.show(context, image_id, [ 1199.000819] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1199.002235] 
env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] _reraise_translated_image_exception(image_id) [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise new_exc.with_traceback(exc_trace) [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] result = getattr(controller, method)(*args, **kwargs) [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self._get(image_id) [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] resp, body = self.http_client.get(url, headers=header) [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self.request(url, 'GET', **kwargs) [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self._handle_response(resp) [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise exc.from_response(resp, resp.content) [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] nova.exception.ImageNotAuthorized: Not authorized for image ab7fcb5a-745a-4c08-9c04-49b187178f83. 
[ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] During handling of the above exception, another exception occurred: [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Traceback (most recent call last): [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self._build_and_run_instance(context, instance, image, [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] with excutils.save_and_reraise_exception(): [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self.force_reraise() [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise self.value [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] with self.rt.instance_claim(context, instance, node, allocs, [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self.abort() [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1199.002235] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return f(*args, **kwargs) [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self._unset_instance_host_and_node(instance) [ 1199.003377] env[60175]: ERROR nova.compute.manager 
[instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] instance.save() [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] updates, result = self.indirection_api.object_action( [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return cctxt.call(context, 'object_action', objinst=objinst, [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] result = self.transport._send( [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self._driver.send(target, ctxt, message, [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise result [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] nova.exception_Remote.InstanceNotFound_Remote: Instance 8f4635d8-5789-4402-8ca2-543b4d4dfc76 could not be found. 
[ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Traceback (most recent call last): [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return getattr(target, method)(*args, **kwargs) [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return fn(self, *args, **kwargs) [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] old_ref, inst_ref = db.instance_update_and_get_original( [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return f(*args, **kwargs) [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] with excutils.save_and_reraise_exception() as ectxt: [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self.force_reraise() [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise self.value [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return f(*args, 
**kwargs) [ 1199.003377] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return f(context, *args, **kwargs) [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise exception.InstanceNotFound(instance_id=uuid) [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] nova.exception.InstanceNotFound: Instance 8f4635d8-5789-4402-8ca2-543b4d4dfc76 could not be found. [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] During handling of the above exception, another exception occurred: [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Traceback (most recent call last): [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] ret = obj(*args, **kwargs) [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] exception_handler_v20(status_code, error_body) [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise client_exc(message=error_message, [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1199.004609] 
env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Neutron server returns request_ids: ['req-50d7ccf6-9901-46f8-aa8d-1d60dfcd04c6'] [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] During handling of the above exception, another exception occurred: [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] Traceback (most recent call last): [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self._deallocate_network(context, instance, requested_networks) [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self.network_api.deallocate_for_instance( [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] data = neutron.list_ports(**search_opts) [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] ret = obj(*args, **kwargs) [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self.list('ports', self.ports_path, retrieve_all, [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] ret = obj(*args, **kwargs) [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1199.004609] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] for r in self._pagination(collection, path, **params): [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] res = self.get(path, params=params) [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] ret = obj(*args, **kwargs) [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self.retry_request("GET", action, body=body, [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] ret = obj(*args, **kwargs) [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] return self.do_request(method, action, body=body, [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] ret = obj(*args, **kwargs) [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] self._handle_fault_response(status_code, replybody, resp) [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] raise exception.Unauthorized() [ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] nova.exception.Unauthorized: Not authorized. 
[ 1199.005757] env[60175]: ERROR nova.compute.manager [instance: 8f4635d8-5789-4402-8ca2-543b4d4dfc76] [ 1199.022612] env[60175]: DEBUG oslo_concurrency.lockutils [None req-fb253bb0-2639-43c1-b22e-b2d4ca1b6cbc tempest-MigrationsAdminTest-1576523191 tempest-MigrationsAdminTest-1576523191-project-member] Lock "8f4635d8-5789-4402-8ca2-543b4d4dfc76" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 298.806s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1230.950979] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1230.951375] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Starting heal instance info cache {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1230.951375] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Rebuilding the list of instances to heal {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1230.961835] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Skipping network cache update for instance because it is Building. {{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1230.961996] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Didn't find any instances for network info cache update. 
{{(pid=60175) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1230.962211] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1231.950176] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1231.962056] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1231.962056] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1231.962443] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1231.962443] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60175) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1231.963331] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7b2f7d7-1d05-45da-bd84-c4c06ee62cbc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1231.973344] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-513fcd19-0b6f-410b-b163-f24603c1c679 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1231.988011] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c0f08ec-86c6-44c1-8387-a45ad9238cc1 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1231.993762] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-463f7fe2-79d1-4efd-9f9c-22ec37f056b8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1232.023813] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180710MB free_disk=146GB free_vcpus=48 pci_devices=None {{(pid=60175) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1232.024275] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1232.024611] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1232.061881] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Instance d2ff993d-35d8-479c-bb3e-2c06080896d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60175) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1232.061881] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1232.062086] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=149GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=60175) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1232.089458] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99fa7a56-915c-48a0-8c9d-61c480bf2fdc {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1232.097043] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97e862ce-2b12-4dc5-955e-c05ff901d4f9 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1232.127987] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-857a3224-913a-43c4-986d-6a87427d71a6 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1232.136063] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93a041d9-d5fe-41a1-9d9d-7dc44a38f7e8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1232.148268] env[60175]: DEBUG nova.compute.provider_tree [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1232.156965] env[60175]: DEBUG nova.scheduler.client.report [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1232.171482] env[60175]: DEBUG nova.compute.resource_tracker [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60175) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1232.171911] env[60175]: DEBUG oslo_concurrency.lockutils [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1233.172472] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1235.946695] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1235.949329] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1235.949513] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1235.949678] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1235.949830] env[60175]: DEBUG nova.compute.manager [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60175) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1237.951354] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1240.945382] env[60175]: DEBUG oslo_service.periodic_task [None req-99c797d3-5983-40f1-9e6a-449443a1ecb3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60175) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1246.747067] env[60175]: WARNING oslo_vmware.rw_handles [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles response.begin() [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1246.747067] env[60175]: ERROR oslo_vmware.rw_handles [ 1246.747836] env[60175]: DEBUG nova.virt.vmwareapi.images [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Downloaded image file data ab7fcb5a-745a-4c08-9c04-49b187178f83 to vmware_temp/d637dc45-edda-40b0-806b-54db38df4a6c/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk on the data store datastore2 {{(pid=60175) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1246.749486] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Caching image {{(pid=60175) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1246.752063] env[60175]: DEBUG nova.virt.vmwareapi.vm_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Copying Virtual Disk [datastore2] vmware_temp/d637dc45-edda-40b0-806b-54db38df4a6c/ab7fcb5a-745a-4c08-9c04-49b187178f83/tmp-sparse.vmdk to [datastore2] 
vmware_temp/d637dc45-edda-40b0-806b-54db38df4a6c/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk {{(pid=60175) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1246.752063] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c70b5ab2-c983-4e3a-86cf-98eb18db3ac8 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1246.758468] env[60175]: DEBUG oslo_vmware.api [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Waiting for the task: (returnval){ [ 1246.758468] env[60175]: value = "task-4292986" [ 1246.758468] env[60175]: _type = "Task" [ 1246.758468] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1246.766228] env[60175]: DEBUG oslo_vmware.api [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Task: {'id': task-4292986, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1247.270103] env[60175]: DEBUG oslo_vmware.exceptions [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Fault InvalidArgument not matched. {{(pid=60175) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1247.270103] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ab7fcb5a-745a-4c08-9c04-49b187178f83/ab7fcb5a-745a-4c08-9c04-49b187178f83.vmdk" {{(pid=60175) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1247.270103] env[60175]: ERROR nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1247.270103] env[60175]: Faults: ['InvalidArgument'] [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Traceback (most recent call last): [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] yield resources [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] self.driver.spawn(context, instance, image_meta, [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1247.270103] env[60175]: ERROR 
nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] self._fetch_image_if_missing(context, vi) [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] image_cache(vi, tmp_image_ds_loc) [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] vm_util.copy_virtual_disk( [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] session._wait_for_task(vmdk_copy_task) [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] return self.wait_for_task(task_ref) [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] return evt.wait() [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] result = hub.switch() [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] return self.greenlet.switch() [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] self.f(*self.args, **self.kw) [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] raise exceptions.translate_fault(task_info.error) [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] oslo_vmware.exceptions.VimFaultException: A 
specified parameter was not correct: fileType [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Faults: ['InvalidArgument'] [ 1247.270103] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] [ 1247.271216] env[60175]: INFO nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Terminating instance [ 1247.273064] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Start destroying the instance on the hypervisor. {{(pid=60175) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1247.273424] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Destroying instance {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1247.274275] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de4871c0-e3de-4a18-b47b-e4d0f975c2bb {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1247.280935] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Unregistering the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1247.281278] env[60175]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d46d5e1b-8e30-4221-b7e0-b873cdbcb707 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1247.345927] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Unregistered the VM {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1247.346457] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Deleting contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1247.346759] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Deleting the datastore file [datastore2] d2ff993d-35d8-479c-bb3e-2c06080896d0 {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1247.347133] env[60175]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3fc56e7f-54cc-4f51-a874-d72ef9cb39cb {{(pid=60175) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1247.354826] env[60175]: DEBUG oslo_vmware.api [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Waiting for the task: (returnval){ [ 1247.354826] env[60175]: value = "task-4292988" [ 1247.354826] env[60175]: _type = "Task" [ 1247.354826] env[60175]: } to complete. {{(pid=60175) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1247.361550] env[60175]: DEBUG oslo_vmware.api [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Task: {'id': task-4292988, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1247.864410] env[60175]: DEBUG oslo_vmware.api [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Task: {'id': task-4292988, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070354} completed successfully. {{(pid=60175) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1247.865495] env[60175]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Deleted the datastore file {{(pid=60175) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1247.865495] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Deleted contents of the VM from datastore datastore2 {{(pid=60175) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1247.865629] env[60175]: DEBUG nova.virt.vmwareapi.vmops [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Instance destroyed {{(pid=60175) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1247.865741] env[60175]: INFO nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 1247.867951] env[60175]: DEBUG nova.compute.claims [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Aborting claim: {{(pid=60175) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1247.868134] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1247.868347] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1247.932399] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f930a26a-e7d4-445e-8d97-2ead7e902cce {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1247.939848] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da12d38-ca0d-405b-a7db-b01312655900 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1247.973406] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e55e835-771a-48bc-9451-0c262f05032c {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1247.980231] env[60175]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-721b9f44-dbc9-4b7e-a410-fb102ec10aa0 {{(pid=60175) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1247.992896] env[60175]: DEBUG nova.compute.provider_tree [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Inventory has not changed in ProviderTree for provider: 3984c8da-53ad-4889-8d1f-23bab60fa84e {{(pid=60175) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1248.001476] env[60175]: DEBUG nova.scheduler.client.report [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Inventory has not changed for provider 3984c8da-53ad-4889-8d1f-23bab60fa84e based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 268, 'reserved': 0, 'min_unit': 1, 'max_unit': 146, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60175) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1248.014143] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.146s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1248.014765] env[60175]: ERROR nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1248.014765] env[60175]: Faults: ['InvalidArgument']
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Traceback (most recent call last):
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] self.driver.spawn(context, instance, image_meta,
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] self._fetch_image_if_missing(context, vi)
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] image_cache(vi, tmp_image_ds_loc)
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] vm_util.copy_virtual_disk(
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] session._wait_for_task(vmdk_copy_task)
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] return self.wait_for_task(task_ref)
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] return evt.wait()
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] result = hub.switch()
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] return self.greenlet.switch()
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] self.f(*self.args, **self.kw)
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] raise exceptions.translate_fault(task_info.error)
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Faults: ['InvalidArgument']
[ 1248.014765] env[60175]: ERROR nova.compute.manager [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0]
[ 1248.015542] env[60175]: DEBUG nova.compute.utils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] VimFaultException {{(pid=60175) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1248.016923] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Build of instance d2ff993d-35d8-479c-bb3e-2c06080896d0 was re-scheduled: A specified parameter was not correct: fileType
[ 1248.016923] env[60175]: Faults: ['InvalidArgument'] {{(pid=60175) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1248.017323] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Unplugging VIFs for instance {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1248.017528] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60175) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1248.017743] env[60175]: DEBUG nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Deallocating network for instance {{(pid=60175) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1248.017932] env[60175]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] deallocate_for_instance() {{(pid=60175) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1248.295718] env[60175]: DEBUG nova.network.neutron [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Updating instance_info_cache with network_info: [] {{(pid=60175) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1248.305786] env[60175]: INFO nova.compute.manager [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] [instance: d2ff993d-35d8-479c-bb3e-2c06080896d0] Took 0.29 seconds to deallocate network for instance.
[ 1248.389956] env[60175]: INFO nova.scheduler.client.report [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Deleted allocations for instance d2ff993d-35d8-479c-bb3e-2c06080896d0
[ 1248.405700] env[60175]: DEBUG oslo_concurrency.lockutils [None req-2f66bb28-46af-4c12-b6e7-ef2100de3e72 tempest-ImagesOneServerNegativeTestJSON-36281315 tempest-ImagesOneServerNegativeTestJSON-36281315-project-member] Lock "d2ff993d-35d8-479c-bb3e-2c06080896d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 94.863s {{(pid=60175) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
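The "compute_resources" lock messages above (acquired after waiting 0.000s, released after being held 0.146s) and the final per-instance build lock released after 94.863s are produced by oslo.concurrency's lockutils, which logs wait and hold times for every named lock it manages. The following is a minimal, hypothetical sketch of that locking pattern; the function bodies and the tracker methods are illustrative placeholders, not Nova's actual ResourceTracker code.

    from oslo_concurrency import lockutils


    # Decorator form: calls are serialized behind the named in-process lock,
    # and lockutils emits the "acquired ... waited" / "released ... held" lines.
    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(tracker, instance):
        # Illustrative placeholder only: the real ResourceTracker method
        # returns the claimed CPU/RAM/disk and removes placement allocations.
        tracker.abort(instance)


    def update_usage(tracker, instance):
        # Context-manager form of the same named lock for an ad-hoc section.
        with lockutils.lock('compute_resources'):
            tracker.update(instance)  # placeholder work done under the lock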