[ 627.229381] env[66952]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 627.973682] env[67270]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 629.600387] env[67270]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=67270) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 629.600803] env[67270]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=67270) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 629.600803] env[67270]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=67270) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 629.601096] env[67270]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 629.602438] env[67270]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 629.736151] env[67270]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=67270) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 629.747577] env[67270]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=67270) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 629.860728] env[67270]: INFO nova.virt.driver [None req-0eb54f69-3976-4968-9222-84f63a8f2112 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 629.948069] env[67270]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 629.948280] env[67270]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 629.948399] env[67270]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=67270) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 633.393336] env[67270]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-e771372a-c0d8-429b-ae34-27e415d0b496 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 633.409927] env[67270]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=67270) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 633.410114] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-5f5dd695-bccd-4981-ac08-e3188af45d74 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 633.438309] env[67270]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 0bee4.
[ 633.438511] env[67270]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.490s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 633.439154] env[67270]: INFO nova.virt.vmwareapi.driver [None req-0eb54f69-3976-4968-9222-84f63a8f2112 None None] VMware vCenter version: 7.0.3
[ 633.442638] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98f6ebde-9472-4128-9989-75fcbf41572a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 633.461250] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1825637-fb0c-496a-a88b-eaa12faef14e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 633.468055] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb1124f0-2776-4532-a7ff-4bd2c4688b80 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 633.475774] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b81798b4-c99a-439e-8d5e-bb8a2c7ce679 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 633.490693] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dde7d6f-aa34-4524-8895-c59e6532a2d8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 633.497574] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11d00977-044e-499b-b40b-7225b0aab7fc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 633.528811] env[67270]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-dd47043b-fe30-4186-90a8-ff6c8eadf94a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 633.535179] env[67270]: DEBUG nova.virt.vmwareapi.driver [None req-0eb54f69-3976-4968-9222-84f63a8f2112 None None] Extension org.openstack.compute already exists. {{(pid=67270) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 633.537919] env[67270]: INFO nova.compute.provider_config [None req-0eb54f69-3976-4968-9222-84f63a8f2112 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 633.555143] env[67270]: DEBUG nova.context [None req-0eb54f69-3976-4968-9222-84f63a8f2112 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),402f5873-bf17-4304-a507-a5be74f6a837(cell1) {{(pid=67270) load_cells /opt/stack/nova/nova/context.py:464}}
[ 633.557199] env[67270]: DEBUG oslo_concurrency.lockutils [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 633.557429] env[67270]: DEBUG oslo_concurrency.lockutils [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 633.558197] env[67270]: DEBUG oslo_concurrency.lockutils [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 633.558591] env[67270]: DEBUG oslo_concurrency.lockutils [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Acquiring lock "402f5873-bf17-4304-a507-a5be74f6a837" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 633.558785] env[67270]: DEBUG oslo_concurrency.lockutils [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Lock "402f5873-bf17-4304-a507-a5be74f6a837" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 633.559841] env[67270]: DEBUG oslo_concurrency.lockutils [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Lock "402f5873-bf17-4304-a507-a5be74f6a837" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 633.573414] env[67270]: DEBUG oslo_db.sqlalchemy.engines [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67270) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 633.573819] env[67270]: DEBUG oslo_db.sqlalchemy.engines [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=67270) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 633.580999] env[67270]: ERROR nova.db.main.api [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 633.580999] env[67270]: result = function(*args, **kwargs)
[ 633.580999] env[67270]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 633.580999] env[67270]: return func(*args, **kwargs)
[ 633.580999] env[67270]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 633.580999] env[67270]: result = fn(*args, **kwargs)
[ 633.580999] env[67270]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 633.580999] env[67270]: return f(*args, **kwargs)
[ 633.580999] env[67270]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 633.580999] env[67270]: return db.service_get_minimum_version(context, binaries)
[ 633.580999] env[67270]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 633.580999] env[67270]: _check_db_access()
[ 633.580999] env[67270]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 633.580999] env[67270]: stacktrace = ''.join(traceback.format_stack())
[ 633.580999] env[67270]:
[ 633.581694] env[67270]: ERROR nova.db.main.api [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 633.581694] env[67270]: result = function(*args, **kwargs)
[ 633.581694] env[67270]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 633.581694] env[67270]: return func(*args, **kwargs)
[ 633.581694] env[67270]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 633.581694] env[67270]: result = fn(*args, **kwargs)
[ 633.581694] env[67270]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 633.581694] env[67270]: return f(*args, **kwargs)
[ 633.581694] env[67270]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 633.581694] env[67270]: return db.service_get_minimum_version(context, binaries)
[ 633.581694] env[67270]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 633.581694] env[67270]: _check_db_access()
[ 633.581694] env[67270]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 633.581694] env[67270]: stacktrace = ''.join(traceback.format_stack())
[ 633.581694] env[67270]:
[ 633.582074] env[67270]: WARNING nova.objects.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 633.582248] env[67270]: WARNING nova.objects.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Failed to get minimum service version for cell 402f5873-bf17-4304-a507-a5be74f6a837
[ 633.582710] env[67270]: DEBUG oslo_concurrency.lockutils [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Acquiring lock "singleton_lock" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 633.582874] env[67270]: DEBUG oslo_concurrency.lockutils [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Acquired lock "singleton_lock" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 633.583139] env[67270]: DEBUG oslo_concurrency.lockutils [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Releasing lock "singleton_lock" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 633.583484] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Full set of CONF: {{(pid=67270) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 633.583632] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ******************************************************************************** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 633.583762] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] Configuration options gathered from: {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 633.583897] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 633.584107] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 633.584239] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ================================================================================ {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 633.584450] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] allow_resize_to_same_host = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.584631] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] arq_binding_timeout = 300 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.584764] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] backdoor_port = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.584891] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] backdoor_socket = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.585066] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] block_device_allocate_retries = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.585233] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] block_device_allocate_retries_interval = 3 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.585403] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cert = self.pem {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.585579] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.585741] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute_monitors = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.585908] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] config_dir = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.586091] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] config_drive_format = iso9660 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.586259] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.586445] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] config_source = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.586618] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] console_host = devstack {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.586783] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] control_exchange = nova {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.586943] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cpu_allocation_ratio = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.587125] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] daemon = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.587290] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] debug = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.587454] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] default_access_ip_network_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.587615] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] default_availability_zone = nova {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.587770] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] default_ephemeral_format = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.588024] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.588197] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] default_schedule_zone = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.588392] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] disk_allocation_ratio = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.588567] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] enable_new_services = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.588750] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] enabled_apis = ['osapi_compute'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.588914] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] enabled_ssl_apis = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.589086] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] flat_injected = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.589254] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] force_config_drive = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.589414] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] force_raw_images = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.589583] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] graceful_shutdown_timeout = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.589747] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] heal_instance_info_cache_interval = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.589971] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] host = cpu-1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.590158] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] initial_cpu_allocation_ratio = 4.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.590323] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] initial_disk_allocation_ratio = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.590488] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] initial_ram_allocation_ratio = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.590703] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.590869] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] instance_build_timeout = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.591040] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] instance_delete_interval = 300 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.591213] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] instance_format = [instance: %(uuid)s] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.591380] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] instance_name_template = instance-%08x {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.591545] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] instance_usage_audit = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.591717] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] instance_usage_audit_period = month {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.591880] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.592056] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] instances_path = /opt/stack/data/nova/instances {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.592226] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] internal_service_availability_zone = internal {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.592387] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] key = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.592549] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] live_migration_retry_count = 30 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.592712] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] log_config_append = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.592878] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.593046] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] log_dir = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.593210] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] log_file = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.593341] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] log_options = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.593505] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] log_rotate_interval = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.593677] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] log_rotate_interval_type = days {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.593844] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] log_rotation_type = none {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.593974] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.594109] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.594281] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.594453] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.594572] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.594734] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] long_rpc_timeout = 1800 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.594890] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] max_concurrent_builds = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.595056] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] max_concurrent_live_migrations = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.595215] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] max_concurrent_snapshots = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.595372] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] max_local_block_devices = 3 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.595532] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] max_logfile_count = 30 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.595687] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] max_logfile_size_mb = 200 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.595845] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] maximum_instance_delete_attempts = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.596012] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] metadata_listen = 0.0.0.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.596210] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] metadata_listen_port = 8775 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.596399] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] metadata_workers = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.596565] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] migrate_max_retries = -1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.596733] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] mkisofs_cmd = genisoimage {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.596945] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] my_block_storage_ip = 10.180.1.21 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.597090] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] my_ip = 10.180.1.21 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.597255] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] network_allocate_retries = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.597438] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.597606] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] osapi_compute_listen = 0.0.0.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.597769] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] osapi_compute_listen_port = 8774 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.597938] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] osapi_compute_unique_server_name_scope = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.598116] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] osapi_compute_workers = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.598307] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] password_length = 12 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.598483] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] periodic_enable = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.598650] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] periodic_fuzzy_delay = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.598820] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] pointer_model = usbtablet {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.598988] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] preallocate_images = none {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.599163] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] publish_errors = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.599313] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] pybasedir = /opt/stack/nova {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.599486] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ram_allocation_ratio = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.599647] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] rate_limit_burst = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.599812] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] rate_limit_except_level = CRITICAL {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.599972] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] rate_limit_interval = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.600145] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] reboot_timeout = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.600306] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] reclaim_instance_interval = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.600511] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] record = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.600759] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] reimage_timeout_per_gb = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.600938] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] report_interval = 120 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.601183] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] rescue_timeout = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.601398] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] reserved_host_cpus = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.601625] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] reserved_host_disk_mb = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.601855] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] reserved_host_memory_mb = 512 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.602112] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] reserved_huge_pages = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.602319] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] resize_confirm_window = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.602522] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] resize_fs_using_block_device = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.602718] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] resume_guests_state_on_host_boot = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.602894] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.603071] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] rpc_response_timeout = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.603240] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] run_external_periodic_tasks = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.603412] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] running_deleted_instance_action = reap {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.603575] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] running_deleted_instance_poll_interval = 1800 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.603735] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] running_deleted_instance_timeout = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.603894] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler_instance_sync_interval = 120 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.604042] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] service_down_time = 300 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.604219] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] servicegroup_driver = db {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.604381] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] shelved_offload_time = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.604541] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] shelved_poll_interval = 3600 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.604709] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] shutdown_timeout = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.604871] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] source_is_ipv6 = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.605040] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ssl_only = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.605306] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.605476] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] sync_power_state_interval = 600 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.605638] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] sync_power_state_pool_size = 1000 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.605802] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] syslog_log_facility = LOG_USER {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.605957] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] tempdir = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.606127] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] timeout_nbd = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.606299] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] transport_url = **** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.606458] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] update_resources_interval = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.606616] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] use_cow_images = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.606773] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] use_eventlog = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.606929] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] use_journal = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.607097] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] use_json = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.607260] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] use_rootwrap_daemon = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.607420] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] use_stderr = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.607577] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] use_syslog = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.607732] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vcpu_pin_set = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.607897] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plugging_is_fatal = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.608083] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plugging_timeout = 300 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.608256] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] virt_mkfs = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.608414] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] volume_usage_poll_interval = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.608576] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] watch_log_file = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.608746] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] web = /usr/share/spice-html5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 633.608935] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_concurrency.disable_process_locking = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.609288] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.609499] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.609671] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.609845] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_metrics.metrics_process_name = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.610021] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.610189] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.610370] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.auth_strategy = keystone {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.610536] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.compute_link_prefix = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.610714] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.610887] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.dhcp_domain = novalocal {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.611068] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.enable_instance_password = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.611235] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.glance_link_prefix = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.611401] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.611574] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.instance_list_cells_batch_strategy = distributed {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.611739] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.instance_list_per_project_cells = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.611900] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.list_records_by_skipping_down_cells = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.612076] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.local_metadata_per_cell = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.612248] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.max_limit = 1000 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.612415] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.metadata_cache_expiration = 15 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.612589] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.neutron_default_tenant_id = default {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.612792] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.use_forwarded_for = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.612968] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.use_neutron_default_nets = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.613152] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.613319] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.vendordata_dynamic_failure_fatal = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.613489] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.613663] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.vendordata_dynamic_ssl_certfile = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.613836] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.vendordata_dynamic_targets = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.614078] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.vendordata_jsonfile_path = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.614206] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api.vendordata_providers = ['StaticJSON'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.614404] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.backend = dogpile.cache.memcached {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.614577] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.backend_argument = **** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.614753] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.config_prefix = cache.oslo {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.614926] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.dead_timeout = 60.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.615106] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.debug_cache_backend = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.615274] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.enable_retry_client = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.615440] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.enable_socket_keepalive = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.615615] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.enabled = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.615783] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.expiration_time = 600 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.615950] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.hashclient_retry_attempts = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.616134] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.hashclient_retry_delay = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.616302] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.memcache_dead_retry = 300 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.616475] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.memcache_password = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.616643] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.616808] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.616974] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.memcache_pool_maxsize = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.617163] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.memcache_pool_unused_timeout = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.617345] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.memcache_sasl_enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.617534] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.memcache_servers = ['localhost:11211'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.617704] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.memcache_socket_timeout = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.617876] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.memcache_username = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.618056] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.proxies = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.618228] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.retry_attempts = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.618428] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.retry_delay = 0.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.618604] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.socket_keepalive_count = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.618814] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.socket_keepalive_idle = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.618993] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.socket_keepalive_interval = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.619171] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.tls_allowed_ciphers = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.619359] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.tls_cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.619533] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.tls_certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.619700] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.tls_enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.619862] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cache.tls_keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.620044] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.auth_section = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.620226] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.auth_type = password {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.620389] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.620568] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.catalog_info = volumev3::publicURL {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.620730] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.620896] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.621071] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.cross_az_attach = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.621535] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.debug = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.621535] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.endpoint_template = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 633.621611] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d
None None] cinder.http_retries = 3 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.621709] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.621870] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.622055] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.os_region_name = RegionOne {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.622225] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.622391] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cinder.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.622567] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.622729] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.cpu_dedicated_set = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.622887] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.cpu_shared_set = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.623068] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.image_type_exclude_list = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.623240] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.live_migration_wait_for_vif_plug = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.623407] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.max_concurrent_disk_ops = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.623573] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.max_disk_devices_to_attach = -1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.623736] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.623907] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
633.624092] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.resource_provider_association_refresh = 300 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.624261] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.shutdown_retry_interval = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.624444] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.624666] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] conductor.workers = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.624868] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] console.allowed_origins = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.625046] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] console.ssl_ciphers = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.625226] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] console.ssl_minimum_version = default {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.625403] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] consoleauth.token_ttl = 600 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.625571] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.625729] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.625892] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.626065] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.connect_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.626229] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.connect_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.626387] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.endpoint_override = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.626553] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.insecure = False {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.626709] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.626866] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.max_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.627032] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.min_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.627193] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.region_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.627349] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.service_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.627519] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.service_type = accelerator {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.627680] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.627837] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.status_code_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.627992] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.status_code_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.628163] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.628377] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.628546] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] cyborg.version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.628735] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.backend = sqlalchemy {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.628916] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.connection = **** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.629104] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.connection_debug = 0 {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.629280] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.connection_parameters = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.629477] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.connection_recycle_time = 3600 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.629684] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.connection_trace = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.629857] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.db_inc_retry_interval = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.630035] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.db_max_retries = 20 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.630206] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.db_max_retry_interval = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.630373] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.db_retry_interval = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.630547] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.max_overflow = 50 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.630714] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.max_pool_size = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.630885] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.max_retries = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.631060] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.mysql_enable_ndb = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.631240] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.mysql_sql_mode = TRADITIONAL {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.631402] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.mysql_wsrep_sync_wait = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.631568] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.pool_timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.631741] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.retry_interval = 10 
{{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.631903] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.slave_connection = **** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.632084] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.sqlite_synchronous = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.632272] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] database.use_db_reconnect = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.632447] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.backend = sqlalchemy {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.632639] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.connection = **** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.632810] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.connection_debug = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.632985] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.connection_parameters = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.633164] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.connection_recycle_time = 3600 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.633335] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.connection_trace = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.633503] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.db_inc_retry_interval = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.633667] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.db_max_retries = 20 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.633830] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.db_max_retry_interval = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.633992] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.db_retry_interval = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.634177] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.max_overflow = 50 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.634342] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.max_pool_size = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.634514] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.max_retries = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.634678] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.mysql_enable_ndb = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.634849] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.635014] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.mysql_wsrep_sync_wait = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.635194] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.pool_timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.637044] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.retry_interval = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.637244] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.slave_connection = **** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.637428] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] api_database.sqlite_synchronous = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.637617] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] devices.enabled_mdev_types = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.637803] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.637971] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ephemeral_storage_encryption.enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.638155] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ephemeral_storage_encryption.key_size = 512 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.638362] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.api_servers = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.638543] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.cafile = None {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.638710] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.638878] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.639059] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.connect_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.639265] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.connect_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.639417] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.debug = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.639602] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.default_trusted_certificate_ids = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.639769] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.enable_certificate_validation = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.639931] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.enable_rbd_download = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.640107] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.endpoint_override = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.640281] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.640456] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.640619] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.max_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.640778] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.min_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.640944] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.num_retries = 3 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.641131] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.rbd_ceph_conf = {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.641299] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.rbd_connect_timeout = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.641473] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.rbd_pool = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.641642] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.rbd_user = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.641803] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.region_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.641963] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.service_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.642144] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.service_type = image {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.642311] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.642472] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.status_code_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.642631] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.status_code_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.642790] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.642972] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.643152] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.verify_glance_signatures = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.643315] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] glance.version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.643545] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] guestfs.debug = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.643717] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.config_drive_cdrom = False {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.643885] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.config_drive_inject_password = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.644066] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.644234] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.enable_instance_metrics_collection = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.644398] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.enable_remotefx = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.644571] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.instances_path_share = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.644737] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.iscsi_initiator_list = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.644899] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.limit_cpu_features = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.645079] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.645246] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.645416] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.power_state_check_timeframe = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.645581] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.power_state_event_polling_interval = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.645751] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.645916] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.use_multipath_io = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.646093] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.volume_attach_retry_count = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.646272] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.volume_attach_retry_interval = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.646449] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.vswitch_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.646615] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.646787] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] mks.enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.647177] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.647375] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] image_cache.manager_interval = 2400 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.647548] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] image_cache.precache_concurrency = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.647721] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] image_cache.remove_unused_base_images = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.647891] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.648069] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.648280] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] image_cache.subdirectory_name = _base {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.648468] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.api_max_retries = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.648636] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.api_retry_interval = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.648798] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.auth_section = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.648960] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.auth_type = None {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.649132] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.649316] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.649494] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.649654] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.connect_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.649813] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.connect_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.649969] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.endpoint_override = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.650147] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.650305] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.650463] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.max_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.650619] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.min_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.650774] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.partition_key = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.650937] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.peer_list = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.651104] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.region_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.651269] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.serial_console_state_timeout = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.651427] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.service_name = None {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.651595] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.service_type = baremetal {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.651756] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.651911] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.status_code_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.652076] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.status_code_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.652282] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.652495] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.652697] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ironic.version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.652917] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.653122] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] key_manager.fixed_key = **** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657585] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657585] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.barbican_api_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657585] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.barbican_endpoint = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657585] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.barbican_endpoint_type = public {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657585] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.barbican_region_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657585] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657585] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657802] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657802] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657802] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657802] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.number_of_retries = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657802] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.retry_delay = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657802] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.send_service_user_token = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.657802] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658016] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658016] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.verify_ssl = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658016] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican.verify_ssl_path = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658016] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican_service_user.auth_section = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658016] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican_service_user.auth_type = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658016] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican_service_user.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658016] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican_service_user.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658235] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican_service_user.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658235] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican_service_user.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658235] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican_service_user.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658235] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican_service_user.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658383] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] barbican_service_user.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658508] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.approle_role_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658674] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.approle_secret_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658835] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.658994] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.659177] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.659347] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.659507] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.659676] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.kv_mountpoint = secret {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.659839] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.kv_version = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.660008] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.namespace = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.660176] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.root_token_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.660344] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.660506] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.ssl_ca_crt_file = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.660667] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.660827] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.use_ssl = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.660998] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.661182] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.661343] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.661507] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.661664] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.connect_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.661821] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.connect_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.661977] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.endpoint_override = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.662153] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.662312] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.662505] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d 
None None] keystone.max_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.662680] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.min_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.662881] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.region_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.663069] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.service_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.663249] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.service_type = identity {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.663414] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.663576] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.status_code_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.663735] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.status_code_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.663892] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.664088] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.664256] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] keystone.version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.664464] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.connection_uri = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.664625] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.cpu_mode = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.664791] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.cpu_model_extra_flags = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.664960] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.cpu_models = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.665158] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None 
None] libvirt.cpu_power_governor_high = performance {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.665334] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.cpu_power_governor_low = powersave {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.665502] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.cpu_power_management = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.665674] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.665839] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.device_detach_attempts = 8 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.666009] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.device_detach_timeout = 20 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.666184] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.disk_cachemodes = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.666375] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.disk_prefix = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.666557] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.enabled_perf_events = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.666723] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.file_backed_memory = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.666887] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.gid_maps = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.667058] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.hw_disk_discard = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.667223] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.hw_machine_type = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.667398] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.images_rbd_ceph_conf = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.667572] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.667741] env[67270]: DEBUG 
oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.667911] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.images_rbd_glance_store_name = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.668091] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.images_rbd_pool = rbd {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.668299] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.images_type = default {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.668463] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.images_volume_group = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.668699] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.inject_key = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.668879] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.inject_partition = -2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.669093] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.inject_password = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.669287] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.iscsi_iface = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.669466] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.iser_use_multipath = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.669636] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_bandwidth = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.669804] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_completion_timeout = 800 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.669970] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_downtime = 500 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.670179] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_downtime_delay = 75 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.670360] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_downtime_steps = 10 {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.670545] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_inbound_addr = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.670776] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_permit_auto_converge = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.670966] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_permit_post_copy = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.671155] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_scheme = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.671366] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_timeout_action = abort {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.671540] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_tunnelled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.671703] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_uri = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.671868] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.live_migration_with_native_tls = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.672045] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.max_queues = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.672270] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.mem_stats_period_seconds = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.672465] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.nfs_mount_options = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.672799] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.673034] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.num_aoe_discover_tries = 3 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.673220] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.num_iser_scan_tries = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.673441] env[67270]: DEBUG 
oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.num_memory_encrypted_guests = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.673659] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.num_nvme_discover_tries = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.673838] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.num_pcie_ports = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.674027] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.num_volume_scan_tries = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.674233] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.pmem_namespaces = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.674411] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.quobyte_client_cfg = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.674769] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.674975] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.rbd_connect_timeout = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.675181] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.675421] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.675682] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.rbd_secret_uuid = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.675911] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.rbd_user = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.676111] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.realtime_scheduler_priority = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.676301] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.remote_filesystem_transport = ssh {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.676459] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.rescue_image_id = None {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.676636] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.rescue_kernel_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.676861] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.rescue_ramdisk_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.677069] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.rng_dev_path = /dev/urandom {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.677240] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.rx_queue_size = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.677413] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.smbfs_mount_options = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.677729] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.677942] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.snapshot_compression = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.678130] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.snapshot_image_format = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.678379] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.678554] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.sparse_logical_volumes = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.678720] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.swtpm_enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.678945] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.swtpm_group = tss {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.679159] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.swtpm_user = tss {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.679375] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.sysinfo_serial = unique {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.679508] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.tx_queue_size = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.679672] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.uid_maps = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.679837] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.use_virtio_for_bridges = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.680081] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.virt_type = kvm {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.680278] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.volume_clear = zero {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.680449] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.volume_clear_size = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.680634] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.volume_use_multipath = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.680836] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.vzstorage_cache_path = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.681065] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.681289] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.vzstorage_mount_group = qemu {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.681481] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.vzstorage_mount_opts = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.681656] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.681933] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.682196] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.vzstorage_mount_user = stack {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.682466] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.682730] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.auth_section = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.682995] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.auth_type = password {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.683259] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.683518] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.683772] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.684028] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.connect_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.684283] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.connect_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.684557] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.default_floating_pool = public {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.684807] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.endpoint_override = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.685084] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.extension_sync_interval = 600 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.685356] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.http_retries = 3 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.685610] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.685860] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.686125] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.max_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.686388] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.metadata_proxy_shared_secret = **** {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.686637] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.min_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.686903] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.ovs_bridge = br-int {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.687174] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.physnets = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.687441] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.region_name = RegionOne {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.687711] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.service_metadata_proxy = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.687964] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.service_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.688258] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.service_type = network {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.688507] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.688759] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.status_code_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.689024] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.status_code_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.689283] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.689589] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.689832] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] neutron.version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.690102] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] notifications.bdms_in_notifications = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.690377] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] notifications.default_level = INFO 
{{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.690641] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] notifications.notification_format = unversioned {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.690881] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] notifications.notify_on_state_change = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.691143] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.691388] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] pci.alias = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.691666] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] pci.device_spec = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.691951] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] pci.report_in_placement = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.692232] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.auth_section = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.692487] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.auth_type = password {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.692729] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.auth_url = http://10.180.1.21/identity {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.692962] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.693204] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.693439] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.693632] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.connect_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.693798] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.connect_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.693959] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.default_domain_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.694133] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.default_domain_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.694295] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.domain_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.694456] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.domain_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.694613] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.endpoint_override = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.694775] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.694932] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.695101] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.max_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.695260] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.min_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.695430] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.password = **** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.695587] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.project_domain_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.695752] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.project_domain_name = Default {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.695918] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.project_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.696102] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.project_name = service {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.696275] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.region_name = RegionOne {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.696435] env[67270]: DEBUG oslo_service.service 
[None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.service_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.696601] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.service_type = placement {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.696763] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.696947] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.status_code_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.697132] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.status_code_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.697295] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.system_scope = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.697454] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.697611] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.trust_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.697766] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.user_domain_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.697931] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.user_domain_name = Default {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.698155] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.user_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.698416] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.username = placement {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.698616] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.698783] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] placement.version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.698968] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.cores = 20 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.699151] env[67270]: DEBUG 
oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.count_usage_from_placement = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.699364] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.699512] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.injected_file_content_bytes = 10240 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.699677] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.injected_file_path_length = 255 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.699875] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.injected_files = 5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.700101] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.instances = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.700311] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.key_pairs = 100 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.700505] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.metadata_items = 128 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.700678] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.ram = 51200 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.700859] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.recheck_quota = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.701082] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.server_group_members = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.701264] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] quota.server_groups = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.701439] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] rdp.enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.701808] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.702016] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.702196] 
env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.702362] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.image_metadata_prefilter = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.702529] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.702695] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.max_attempts = 3 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.702859] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.max_placement_results = 1000 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.703036] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.703243] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.query_placement_for_availability_zone = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.703372] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.query_placement_for_image_type_support = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.703536] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.703711] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] scheduler.workers = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.703883] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.704064] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.704253] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.704424] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.704636] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.704810] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.705015] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.705219] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.705389] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.host_subset_size = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.705550] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.image_properties_default_architecture = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.705715] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.705884] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.isolated_hosts = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.706151] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.isolated_images = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.706410] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.max_instances_per_host = 50 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.706541] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.706701] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.pci_in_placement = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.706865] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.707040] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.707212] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.707377] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.707576] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.707774] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.707945] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.track_instance_changes = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.708200] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.708467] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] metrics.required = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.708668] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] metrics.weight_multiplier = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.708856] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] metrics.weight_of_unavailable = -10000.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.709056] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] metrics.weight_setting = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.709390] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.709568] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] serial_console.enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.709748] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] serial_console.port_range = 10000:20000 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.709921] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.710151] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.710332] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] serial_console.serialproxy_port = 6083 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.710508] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] service_user.auth_section = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.710687] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] service_user.auth_type = password {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.710849] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] service_user.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.711013] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] service_user.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.711467] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] service_user.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.711467] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] service_user.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.711544] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] service_user.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.711725] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] service_user.send_service_user_token = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.711940] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] service_user.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.712139] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None 
None] service_user.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.712365] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.agent_enabled = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.712634] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.713067] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.713293] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.html5proxy_host = 0.0.0.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.713489] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.html5proxy_port = 6082 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.713651] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.image_compression = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.713814] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.jpeg_compression = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.713974] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.playback_compression = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.714161] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.server_listen = 127.0.0.1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.714335] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.714498] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.streaming_mode = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.714656] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] spice.zlib_compression = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.714827] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] upgrade_levels.baseapi = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.714991] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] upgrade_levels.cert = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.715175] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] upgrade_levels.compute = auto {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.715339] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] upgrade_levels.conductor = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.715506] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] upgrade_levels.scheduler = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.715676] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vendordata_dynamic_auth.auth_section = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.715841] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vendordata_dynamic_auth.auth_type = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.716024] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vendordata_dynamic_auth.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.716182] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vendordata_dynamic_auth.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.716346] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vendordata_dynamic_auth.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.716509] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vendordata_dynamic_auth.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.716669] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vendordata_dynamic_auth.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.716848] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vendordata_dynamic_auth.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.717059] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vendordata_dynamic_auth.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.717267] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.api_retry_count = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.717466] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.ca_file = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.717700] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.cache_prefix = devstack-image-cache {{(pid=67270) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.717911] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.cluster_name = testcl1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.718109] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.connection_pool_size = 10 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.718303] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.console_delay_seconds = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.718490] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.datastore_regex = ^datastore.* {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.718721] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.718901] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.host_password = **** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.719084] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.host_port = 443 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.719272] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.host_username = administrator@vsphere.local {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.719483] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.insecure = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.719647] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.integration_bridge = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.719815] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.maximum_objects = 100 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.719977] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.pbm_default_policy = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.720158] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.pbm_enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.720321] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.pbm_wsdl_location = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.720492] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.720652] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.serial_port_proxy_uri = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.720812] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.serial_port_service_uri = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.721028] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.task_poll_interval = 0.5 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.721216] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.use_linked_clone = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.721391] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.vnc_keymap = en-us {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.721561] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.vnc_port = 5900 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.721755] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vmware.vnc_port_total = 10000 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.721979] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vnc.auth_schemes = ['none'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.722179] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vnc.enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.722500] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.722692] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.722864] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vnc.novncproxy_port = 6080 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.723062] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vnc.server_listen = 127.0.0.1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.723246] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.723410] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d 
None None] vnc.vencrypt_ca_certs = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.723573] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vnc.vencrypt_client_cert = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.723734] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vnc.vencrypt_client_key = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.723917] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.724095] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.disable_deep_image_inspection = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.724329] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.disable_fallback_pcpu_query = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.724503] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.disable_group_policy_check_upcall = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.724671] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.724850] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.disable_rootwrap = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.725044] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.enable_numa_live_migration = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.725215] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.725380] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.725556] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.handle_virt_lifecycle_events = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.725713] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.libvirt_disable_apic = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.725874] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.726050] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.726240] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.726418] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.726585] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.726745] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.726905] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.727078] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.727274] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.727461] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.727650] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.727820] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] wsgi.client_socket_timeout = 900 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.727987] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] wsgi.default_pool_size = 1000 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.728173] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] wsgi.keep_alive = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.728373] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] 
wsgi.max_header_line = 16384 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.728553] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] wsgi.secure_proxy_ssl_header = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.728719] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] wsgi.ssl_ca_file = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.728911] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] wsgi.ssl_cert_file = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.729103] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] wsgi.ssl_key_file = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.729296] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] wsgi.tcp_keepidle = 600 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.729490] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.729662] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] zvm.ca_file = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.729824] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] zvm.cloud_connector_url = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.730153] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.730330] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] zvm.reachable_timeout = 300 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.730516] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_policy.enforce_new_defaults = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.730713] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_policy.enforce_scope = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.730914] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_policy.policy_default_rule = default {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.731175] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
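[editor's note] The long run of DEBUG lines above and below is oslo.config dumping every registered option at service startup; each entry is emitted by ConfigOpts.log_opt_values (the cfg.py:2609 frame repeated in every line), with secret values such as vmware.host_password masked as ****. A minimal sketch of how a service triggers this dump, using a couple of illustrative option names rather than Nova's full set:

    # Minimal sketch: register options with oslo.config and dump them the
    # same way the log above does. Option names here are illustrative.
    import logging

    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF = cfg.CONF
    CONF.register_opts(
        [
            cfg.BoolOpt('enabled', default=False),
            cfg.PortOpt('serialproxy_port', default=6083),
        ],
        group='serial_console',
    )

    # Emits one DEBUG line per option, producing the
    # "group.option = value ... log_opt_values .../oslo_config/cfg.py:2609"
    # pattern seen throughout this section.
    CONF.log_opt_values(LOG, logging.DEBUG)

Run with DEBUG logging enabled, this prints one line per option in the same group.option = value format as the dump in this log.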
[ 633.731382] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_policy.policy_file = policy.yaml {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.731589] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.731848] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.732133] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.732368] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.732555] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.732734] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.732960] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.733175] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] profiler.connection_string = messaging:// {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.733352] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] profiler.enabled = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.733527] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] profiler.es_doc_type = notification {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.733696] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] profiler.es_scroll_size = 10000 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.733869] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] profiler.es_scroll_time = 2m {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.734044] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] profiler.filter_error_trace = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.734218] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] profiler.hmac_keys = SECRET_KEY {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.734391] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] profiler.sentinel_service_name = mymaster {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.734569] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] profiler.socket_timeout = 0.1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.734789] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] profiler.trace_sqlalchemy = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.734971] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] remote_debug.host = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.735165] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] remote_debug.port = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.735360] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.735593] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.735809] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.736014] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.736312] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.736549] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.736819] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.737115] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.737414] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.737663] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.737868] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.738056] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.738239] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.738436] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.738611] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.738792] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.738958] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.739137] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.739351] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.739600] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.739789] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.739964] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.740161] env[67270]: DEBUG 
oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.740345] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.740523] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.740697] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.ssl = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.740874] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.741058] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.741246] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.741424] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.741599] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_rabbit.ssl_version = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.741799] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.741986] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_notifications.retry = -1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.742196] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.742377] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_messaging_notifications.transport_url = **** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.742552] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.auth_section = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.742717] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.auth_type = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.742876] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.cafile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.743043] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.certfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.743214] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.collect_timing = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.743373] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.connect_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.743531] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.connect_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.743688] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.endpoint_id = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.743845] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.endpoint_override = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.744012] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.insecure = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.744204] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.keyfile = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.744371] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.max_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.744529] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.min_version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.744685] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.region_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.744840] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.service_name = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.744999] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.service_type = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.745178] env[67270]: DEBUG oslo_service.service [None 
req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.split_loggers = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.745338] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.status_code_retries = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.745500] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.status_code_retry_delay = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.745658] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.timeout = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.745817] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.valid_interfaces = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.745976] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_limit.version = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.746163] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_reports.file_event_handler = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.746327] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_reports.file_event_handler_interval = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.746486] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] oslo_reports.log_dir = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.746656] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.746815] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_linux_bridge_privileged.group = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.746973] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.747154] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.747321] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.747481] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.747651] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.747812] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_ovs_privileged.group = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.747970] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_ovs_privileged.helper_command = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.748152] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.748356] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.748524] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] vif_plug_ovs_privileged.user = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.748698] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_linux_bridge.flat_interface = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.748883] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.749069] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.749253] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.749424] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.749594] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.749763] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.749926] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.750121] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_ovs.isolate_vif = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.750298] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.750468] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.750663] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.750852] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_ovs.ovsdb_interface = native {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.751029] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_vif_ovs.per_port_bridge = False {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.751204] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_brick.lock_path = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.751375] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.751541] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] os_brick.wait_mpath_device_interval = 1 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.751711] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] privsep_osbrick.capabilities = [21] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.751869] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] privsep_osbrick.group = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.752043] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] privsep_osbrick.helper_command = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.752214] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.752380] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] privsep_osbrick.thread_pool_size = 8 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.752540] env[67270]: DEBUG oslo_service.service 
[None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] privsep_osbrick.user = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.752710] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.752868] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] nova_sys_admin.group = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.753031] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] nova_sys_admin.helper_command = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.753199] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.753360] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] nova_sys_admin.thread_pool_size = 8 {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.753517] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] nova_sys_admin.user = None {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 633.753644] env[67270]: DEBUG oslo_service.service [None req-72ae0464-167d-4283-abc3-24c908c07e9d None None] ******************************************************************************** {{(pid=67270) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 633.754090] env[67270]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 633.763898] env[67270]: INFO nova.virt.node [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Generated node identity ddbaf518-603f-4953-8d5d-25c9ed7292bd [ 633.764165] env[67270]: INFO nova.virt.node [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Wrote node identity ddbaf518-603f-4953-8d5d-25c9ed7292bd to /opt/stack/data/n-cpu-1/compute_id [ 633.777717] env[67270]: WARNING nova.compute.manager [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Compute nodes ['ddbaf518-603f-4953-8d5d-25c9ed7292bd'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 633.812675] env[67270]: INFO nova.compute.manager [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 633.836963] env[67270]: WARNING nova.compute.manager [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
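[editor's note] The "Generated node identity" / "Wrote node identity ... to /opt/stack/data/n-cpu-1/compute_id" lines show the compute service persisting a stable per-node UUID so that restarts, and the placement resource provider created a moment later, keep the same identity. A hedged sketch of that read-or-create pattern, as an illustration rather than Nova's nova.virt.node code verbatim:

    # Sketch of the persisted-node-identity pattern suggested by the log:
    # reuse the UUID in the compute_id file if present, otherwise generate
    # one and write it out. Illustrative only, not Nova's implementation.
    import os
    import uuid

    def get_local_node_uuid(state_path='/opt/stack/data/n-cpu-1'):
        identity_file = os.path.join(state_path, 'compute_id')
        if os.path.exists(identity_file):
            with open(identity_file) as f:
                return f.read().strip()
        node_uuid = str(uuid.uuid4())
        with open(identity_file, 'w') as f:
            f.write(node_uuid)
        return node_uuid

The ComputeHostNotFound warnings that follow are consistent with the log's own caveat: on a first start the identity exists on disk before any compute node record exists in the database, and the record is created shortly afterwards.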
[ 633.837220] env[67270]: DEBUG oslo_concurrency.lockutils [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.837446] env[67270]: DEBUG oslo_concurrency.lockutils [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.837591] env[67270]: DEBUG oslo_concurrency.lockutils [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.837750] env[67270]: DEBUG nova.compute.resource_tracker [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 633.838956] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40c259ed-fe6e-4c09-8764-ac3ce1bc9216 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.848190] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eee18280-f52a-4414-b204-1e4de50445cf {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.862839] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec59552f-7134-4486-a86e-9cef24df5d64 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.870012] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dac33582-fa1a-48f5-8192-c48344759cde {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.900733] env[67270]: DEBUG nova.compute.resource_tracker [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180800MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 633.900938] env[67270]: DEBUG oslo_concurrency.lockutils [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.901032] env[67270]: DEBUG oslo_concurrency.lockutils [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.914107] env[67270]: WARNING nova.compute.resource_tracker [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] No compute node 
record for cpu-1:ddbaf518-603f-4953-8d5d-25c9ed7292bd: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host ddbaf518-603f-4953-8d5d-25c9ed7292bd could not be found. [ 633.929386] env[67270]: INFO nova.compute.resource_tracker [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: ddbaf518-603f-4953-8d5d-25c9ed7292bd [ 633.983080] env[67270]: DEBUG nova.compute.resource_tracker [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 633.983300] env[67270]: DEBUG nova.compute.resource_tracker [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 634.091400] env[67270]: INFO nova.scheduler.client.report [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] [req-b2ee6eaa-bdfc-4bc3-9247-e6251bd5b006] Created resource provider record via placement API for resource provider with UUID ddbaf518-603f-4953-8d5d-25c9ed7292bd and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 634.109210] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aa46638-a14f-4616-9dd1-2c35b3e1acf4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.118429] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0569b52c-e012-4ecd-9fe5-90ecb04149a5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.163663] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a15bb101-336f-446d-a29e-896653767913 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.173189] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a002d98c-e75a-4872-aacc-2a6d398250d2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.187453] env[67270]: DEBUG nova.compute.provider_tree [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Updating inventory in ProviderTree for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 634.223489] env[67270]: DEBUG nova.scheduler.client.report [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Updated inventory for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 634.223744] env[67270]: DEBUG nova.compute.provider_tree [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Updating resource provider ddbaf518-603f-4953-8d5d-25c9ed7292bd generation from 0 to 1 during operation: update_inventory {{(pid=67270) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 634.223890] env[67270]: DEBUG nova.compute.provider_tree [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Updating inventory in ProviderTree for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 634.270143] env[67270]: DEBUG nova.compute.provider_tree [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Updating resource provider ddbaf518-603f-4953-8d5d-25c9ed7292bd generation from 1 to 2 during operation: update_traits {{(pid=67270) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 634.288949] env[67270]: DEBUG nova.compute.resource_tracker [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 634.289161] env[67270]: DEBUG oslo_concurrency.lockutils [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.388s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.289337] env[67270]: DEBUG nova.service [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Creating RPC server for service compute {{(pid=67270) start /opt/stack/nova/nova/service.py:182}} [ 634.303526] env[67270]: DEBUG nova.service [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] Join ServiceGroup membership for this service compute {{(pid=67270) start /opt/stack/nova/nova/service.py:199}} [ 634.303724] env[67270]: DEBUG nova.servicegroup.drivers.db [None req-88994e09-9d2f-475e-b261-985af87d7a4d None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=67270) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 676.306960] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 676.322819] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Getting list of instances from cluster (obj){ [ 676.322819] env[67270]: value = "domain-c8" [ 676.322819] env[67270]: _type = "ClusterComputeResource" [ 676.322819] env[67270]: } {{(pid=67270) list_instances 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 676.324292] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6144ec23-ecf9-4704-80d9-49e2ba29ae23 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.336581] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Got total of 0 instances {{(pid=67270) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 676.337642] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 676.338111] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Getting list of instances from cluster (obj){ [ 676.338111] env[67270]: value = "domain-c8" [ 676.338111] env[67270]: _type = "ClusterComputeResource" [ 676.338111] env[67270]: } {{(pid=67270) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 676.340074] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d7b9936-fcb5-4b75-b071-87946b7bb85d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.349778] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Got total of 0 instances {{(pid=67270) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 678.529974] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquiring lock "c2867798-9109-4f85-ae60-3830a711f21f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 678.530303] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Lock "c2867798-9109-4f85-ae60-3830a711f21f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 678.551124] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Starting instance... 
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 678.672545] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 678.672835] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 678.674613] env[67270]: INFO nova.compute.claims [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 678.813128] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42acc6f1-b961-491f-af3a-aa1d5c215f4e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.828933] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24c34245-b0f4-471b-8b2c-1cf4c7b55bb5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.873630] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5314c15e-f930-4679-9554-c0cfe9f78727 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.882143] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2211234-1d80-402f-92ea-a2025bd34fe5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.898213] env[67270]: DEBUG nova.compute.provider_tree [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 678.908536] env[67270]: DEBUG nova.scheduler.client.report [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 678.938379] env[67270]: DEBUG oslo_concurrency.lockutils [None 
req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 678.938379] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 678.980511] env[67270]: DEBUG nova.compute.utils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 678.981950] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 678.982449] env[67270]: DEBUG nova.network.neutron [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 679.002441] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 679.096105] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Start spawning the instance on the hypervisor. 
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 679.621809] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquiring lock "891481a1-edb6-4111-9779-23ba64d85dce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 679.621809] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Lock "891481a1-edb6-4111-9779-23ba64d85dce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 679.645397] env[67270]: DEBUG nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 679.701966] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 679.702967] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 679.705467] env[67270]: INFO nova.compute.claims [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 679.816376] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bc0188b-6e4d-4dec-85e1-33523beaee6e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.826965] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28457c74-4320-424d-bbba-3ad822a343c1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.865152] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ffcd371-8037-4fbd-a1ec-67b13cd26c3f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.873587] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab4ade2b-4e6d-453a-bbf6-fc31065f03c9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.891099] env[67270]: DEBUG 
nova.compute.provider_tree [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 679.902483] env[67270]: DEBUG nova.scheduler.client.report [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 679.922540] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.220s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 679.923030] env[67270]: DEBUG nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 679.964928] env[67270]: DEBUG nova.compute.utils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 679.964928] env[67270]: DEBUG nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Not allocating networking since 'none' was specified. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 679.980241] env[67270]: DEBUG nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Start building block device mappings for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 680.050678] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 680.050933] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 680.051103] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 680.051884] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 680.051884] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 680.051884] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 680.051884] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 680.052081] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} 
[ 680.052586] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 680.052671] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 680.052856] env[67270]: DEBUG nova.virt.hardware [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 680.054056] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fb67153-73b1-4bd8-9e65-c80db981095e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.068490] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a60c4b2-2129-40d4-9148-69c41ebf61ee {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.091564] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9e7df9a-9880-4f6b-b040-01af5dd6f1ff {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.114575] env[67270]: DEBUG nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Start spawning the instance on the hypervisor. 
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 680.151998] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 680.153216] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 680.153216] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 680.153216] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 680.153216] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 680.153216] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 680.153453] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 680.153453] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 680.153453] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd 
tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 680.153589] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 680.153760] env[67270]: DEBUG nova.virt.hardware [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 680.154962] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2083da45-5a85-49f5-96f3-0d01fe120294 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.166355] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a8e339c-683e-4046-af12-b82d0722d358 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.183115] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Instance VIF info [] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 680.193715] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 680.193715] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-84c8b6ba-b512-4af2-b9a0-4cf305b4c725 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.217442] env[67270]: DEBUG nova.policy [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ce732eba493a40f38a261890928afc66', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ee7ebb60622f48bb974035d46b75c62f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 680.220997] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Created folder: OpenStack in parent group-v4. 
[ 680.221884] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Creating folder: Project (b43fb77ab72642ce8b96abbcf355fe27). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 680.222230] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b51b71de-cfcc-4f9f-87c0-1915484db58a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.236062] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Created folder: Project (b43fb77ab72642ce8b96abbcf355fe27) in parent group-v814248. [ 680.236257] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Creating folder: Instances. Parent ref: group-v814249. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 680.236510] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0a847585-972c-4a5d-a146-83c4fc18ff62 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.248216] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Created folder: Instances in parent group-v814249. [ 680.248531] env[67270]: DEBUG oslo.service.loopingcall [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 680.248917] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 680.249167] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-333646f4-fe71-4c8b-9ae7-97dd0e3a8257 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.272909] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 680.272909] env[67270]: value = "task-4110550" [ 680.272909] env[67270]: _type = "Task" [ 680.272909] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 680.287196] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110550, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 680.786357] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110550, 'name': CreateVM_Task} progress is 99%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.250713] env[67270]: DEBUG nova.network.neutron [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Successfully created port: c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 681.290396] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110550, 'name': CreateVM_Task} progress is 99%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.612507] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquiring lock "a51d9480-1aa1-48c9-a05c-943589d6a224" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.612784] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Lock "a51d9480-1aa1-48c9-a05c-943589d6a224" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.633313] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 681.696033] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.696306] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.698412] env[67270]: INFO nova.compute.claims [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 681.794953] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110550, 'name': CreateVM_Task, 'duration_secs': 1.296404} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 681.795267] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 681.798175] env[67270]: DEBUG oslo_vmware.service [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da32a3f9-a4f1-40bc-893b-da9f76e5ec68 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.808656] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 681.808914] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 681.811020] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 681.811020] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-74ebb68c-495a-4374-992c-b25f866b6d5d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.817052] env[67270]: DEBUG oslo_vmware.api [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Waiting for the task: (returnval){ [ 681.817052] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]5205df2c-2403-fe63-5395-eb2dc6aab637" [ 681.817052] env[67270]: _type = "Task" [ 681.817052] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 681.828964] env[67270]: DEBUG oslo_vmware.api [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]5205df2c-2403-fe63-5395-eb2dc6aab637, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.849803] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d663afbb-524f-4ce0-8f59-dc06d8c6fc5c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.858908] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e987641a-3ab0-46b5-82b1-b1650cf32754 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.897286] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f05de75b-ed03-4b5b-a1cf-fdd836d54c82 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.906816] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c602af33-e3ae-44a0-aa34-0eadbd747222 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.924166] env[67270]: DEBUG nova.compute.provider_tree [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 681.935378] env[67270]: DEBUG nova.scheduler.client.report [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 681.957068] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.957677] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Start building networks asynchronously for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 682.002491] env[67270]: DEBUG nova.compute.utils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 682.008233] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 682.008233] env[67270]: DEBUG nova.network.neutron [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 682.020316] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 682.138127] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 682.168477] env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 682.168477] env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 682.168477] env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 682.168901] 
env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 682.168901] env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 682.168901] env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 682.168901] env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 682.169090] env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 682.169242] env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 682.169509] env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 682.169593] env[67270]: DEBUG nova.virt.hardware [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 682.170652] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f567135d-a9ec-4805-9b01-07809827042d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.182672] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5719c375-903b-400e-92b2-c5a284e20a41 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.207644] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquiring lock "4a086288-b773-40aa-b39a-e3f3b9784a05" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.208154] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Lock "4a086288-b773-40aa-b39a-e3f3b9784a05" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.223752] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 682.278760] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.278760] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.284037] env[67270]: INFO nova.compute.claims [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 682.286792] env[67270]: DEBUG nova.policy [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7994f3d5ba584324a043e4e4133cdac7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c721a209cc994945ac297086b06cedd5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 682.332842] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 682.333259] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Processing image 
1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 682.333401] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 682.333543] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 682.333948] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 682.334261] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8eafc4dc-edb0-4f38-8cc3-4ec89d6e9bd8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.351173] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 682.351173] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 682.351173] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0d139eb-9519-4b46-9c8c-4622cdebd07c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.358989] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6ca7ed80-cabd-4678-95d2-46611f60a58a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.366329] env[67270]: DEBUG oslo_vmware.api [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Waiting for the task: (returnval){ [ 682.366329] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52743aeb-7934-b94c-1f64-5a4cf2dd8c95" [ 682.366329] env[67270]: _type = "Task" [ 682.366329] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 682.378064] env[67270]: DEBUG oslo_vmware.api [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52743aeb-7934-b94c-1f64-5a4cf2dd8c95, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 682.436696] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53430ba8-982e-4430-83bc-816bb2a199b5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.444785] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e50abd89-422a-4f78-975e-f315bdd7e715 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.482056] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a382de4b-e057-46c5-8ef4-3242bbb382ef {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.491650] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e6cf501-d723-465e-a21d-859e4479c0f8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.507625] env[67270]: DEBUG nova.compute.provider_tree [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 682.525071] env[67270]: DEBUG nova.scheduler.client.report [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 682.547217] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.547750] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Start building networks asynchronously for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 682.600259] env[67270]: DEBUG nova.compute.utils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 682.602814] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 682.602814] env[67270]: DEBUG nova.network.neutron [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 682.617062] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 682.649728] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Acquiring lock "1e482ed7-9c9f-4713-abde-291417686a78" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.650130] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Lock "1e482ed7-9c9f-4713-abde-291417686a78" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.660782] env[67270]: DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 682.699757] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Start spawning the instance on the hypervisor. 
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 682.731178] env[67270]: DEBUG nova.virt.hardware [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=<?>,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-05-14T00:53:51Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 682.731178] env[67270]: DEBUG nova.virt.hardware [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 682.731178] env[67270]: DEBUG nova.virt.hardware [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 682.731753] env[67270]: DEBUG nova.virt.hardware [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 682.731753] env[67270]: DEBUG nova.virt.hardware [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 682.731927] env[67270]: DEBUG nova.virt.hardware [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 682.732263] env[67270]: DEBUG nova.virt.hardware [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 682.732528] env[67270]: DEBUG nova.virt.hardware [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 682.732766] env[67270]: DEBUG nova.virt.hardware [None 
req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 682.732986] env[67270]: DEBUG nova.virt.hardware [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 682.733805] env[67270]: DEBUG nova.virt.hardware [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 682.734640] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3c70847-e9b8-4238-ba97-92b2bbe5cae5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.739362] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.739843] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.741574] env[67270]: INFO nova.compute.claims [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 682.750690] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-707c9a17-80d9-47d4-b1eb-14c6c5e914c7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.892212] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 682.892510] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Creating directory with path [datastore1] vmware_temp/1d76118f-eb84-479c-828e-6064be110616/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 682.892712] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-38b3f3e9-3392-4180-83dd-4171fb371fdc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.918076] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Created directory with path [datastore1] vmware_temp/1d76118f-eb84-479c-828e-6064be110616/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 682.918076] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Fetch image to [datastore1] vmware_temp/1d76118f-eb84-479c-828e-6064be110616/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 682.918076] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/1d76118f-eb84-479c-828e-6064be110616/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 682.918076] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c68d694-aa56-4e1a-b1e4-7b72d1938946 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.933726] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6b54a4b-4170-4366-adde-722742e64200 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.945772] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5500bfce-d003-4b44-86c2-b7d20dc191d2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.953886] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42d290b1-c723-4561-934d-28bfb08a70da {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.988438] env[67270]: DEBUG nova.policy [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8f56dae6330d42548165d3318314145a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fbad7749517c40908ca9f1ac15dc514b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 682.990763] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f80ba048-02e2-4c05-97c3-a9a2aace30d0 {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.000709] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-de0309ef-9bb9-4c81-832d-2168fab845f0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.003594] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7ecc034-e183-4f47-8ff9-0ea9dd3f1bdd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.037355] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-998d7d53-b177-4af1-b113-493736826a34 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.044609] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 683.049454] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcd46e28-24fd-4e33-93ae-4b727976e26d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.064257] env[67270]: DEBUG nova.compute.provider_tree [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 683.077879] env[67270]: DEBUG nova.scheduler.client.report [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 683.106429] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.366s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.107573] env[67270]: DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Start building networks asynchronously for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 683.137559] env[67270]: DEBUG oslo_vmware.rw_handles [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d76118f-eb84-479c-828e-6064be110616/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 683.200295] env[67270]: DEBUG nova.compute.utils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 683.202898] env[67270]: DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 683.204082] env[67270]: DEBUG nova.network.neutron [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 683.205973] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquiring lock "c847f4cb-1914-497b-8d63-5b99a237e5e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.206217] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Lock "c847f4cb-1914-497b-8d63-5b99a237e5e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.208054] env[67270]: DEBUG nova.network.neutron [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Successfully updated port: c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 683.209070] env[67270]: DEBUG oslo_vmware.rw_handles [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Completed reading data from the image iterator. 
{{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 683.209238] env[67270]: DEBUG oslo_vmware.rw_handles [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d76118f-eb84-479c-828e-6064be110616/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 683.230447] env[67270]: DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 683.233904] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquiring lock "refresh_cache-c2867798-9109-4f85-ae60-3830a711f21f" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 683.233991] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquired lock "refresh_cache-c2867798-9109-4f85-ae60-3830a711f21f" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 683.234443] env[67270]: DEBUG nova.network.neutron [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 683.235740] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Starting instance... 
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 683.334939] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.335226] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.336750] env[67270]: INFO nova.compute.claims [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 683.346598] env[67270]: DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 683.377834] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=<?>,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-05-14T00:53:51Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 683.377834] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 683.377834] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 683.378725] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 
tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 683.379035] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 683.379210] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 683.379407] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 683.379571] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 683.379753] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 683.379925] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 683.380093] env[67270]: DEBUG nova.virt.hardware [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 683.380998] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ed6d01d-412f-4bfa-b89a-362ce9a462a7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.394572] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d984e59-4ef4-4c12-ae62-e6b274fd73b7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.512524] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f731fc4b-9a02-4d3c-854b-bd70938d1b11 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.520438] env[67270]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21b1d0e3-8807-4025-8408-2be808ec0613 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.526733] env[67270]: DEBUG nova.network.neutron [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 683.553937] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1d2c5a1-bf96-4015-a163-7070451e33fa {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.562397] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40dad420-9d62-4c3c-9de6-8fe4c58dcf93 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.579293] env[67270]: DEBUG nova.compute.provider_tree [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 683.589095] env[67270]: DEBUG nova.scheduler.client.report [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 683.605282] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.605792] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Start building networks asynchronously for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 683.642568] env[67270]: DEBUG nova.compute.utils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 683.643707] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 683.643933] env[67270]: DEBUG nova.network.neutron [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 683.656078] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 683.690606] env[67270]: DEBUG nova.policy [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0b63601b9f66444d95953d6e169c9530', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ef24d5b59ad54caeac45b979a55750c2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 683.733627] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Start spawning the instance on the hypervisor. 
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 683.757412] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=<?>,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-05-14T00:53:51Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 683.757653] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 683.757807] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 683.758060] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 683.758586] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 683.758813] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 683.759385] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 683.759385] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 683.759525] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 683.759728] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 683.759941] env[67270]: DEBUG nova.virt.hardware [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 683.760869] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33f05509-1911-4e11-8254-f223e66fcba8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.770167] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-090c07f6-138b-42ba-99f9-5881feaf78ff {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.218304] env[67270]: DEBUG nova.policy [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3c31d8bc8fb64b5487c8345f83035a3c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f397f9a74958467bbb12b0dea4e7ef2e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 684.515787] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquiring lock "379f5a6d-d6d4-434a-b401-1b027434e6fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.516228] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Lock "379f5a6d-d6d4-434a-b401-1b027434e6fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.531034] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 
379f5a6d-d6d4-434a-b401-1b027434e6fd] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 684.598806] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.599114] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.600984] env[67270]: INFO nova.compute.claims [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 684.720493] env[67270]: DEBUG nova.compute.manager [req-cc2817e5-b494-4629-9b91-017368ae6ede req-1f7942ff-835a-42fe-8fad-703f1bf9e330 service nova] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Received event network-vif-plugged-c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 684.720493] env[67270]: DEBUG oslo_concurrency.lockutils [req-cc2817e5-b494-4629-9b91-017368ae6ede req-1f7942ff-835a-42fe-8fad-703f1bf9e330 service nova] Acquiring lock "c2867798-9109-4f85-ae60-3830a711f21f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.720493] env[67270]: DEBUG oslo_concurrency.lockutils [req-cc2817e5-b494-4629-9b91-017368ae6ede req-1f7942ff-835a-42fe-8fad-703f1bf9e330 service nova] Lock "c2867798-9109-4f85-ae60-3830a711f21f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.720493] env[67270]: DEBUG oslo_concurrency.lockutils [req-cc2817e5-b494-4629-9b91-017368ae6ede req-1f7942ff-835a-42fe-8fad-703f1bf9e330 service nova] Lock "c2867798-9109-4f85-ae60-3830a711f21f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.720656] env[67270]: DEBUG nova.compute.manager [req-cc2817e5-b494-4629-9b91-017368ae6ede req-1f7942ff-835a-42fe-8fad-703f1bf9e330 service nova] [instance: c2867798-9109-4f85-ae60-3830a711f21f] No waiting events found dispatching network-vif-plugged-c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 684.720769] env[67270]: WARNING nova.compute.manager [req-cc2817e5-b494-4629-9b91-017368ae6ede req-1f7942ff-835a-42fe-8fad-703f1bf9e330 service nova] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Received unexpected event network-vif-plugged-c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3 for instance with vm_state building and task_state spawning. 
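
The three "c2867798-...-events" lock records above trace Nova's per-instance event dispatch: an external network-vif-plugged notification from Neutron is matched against waiters registered for that instance under a "<uuid>-events" lock, and an event that arrives while no waiter is registered (as here, where the instance is still building) is logged as unexpected and dropped. Below is a minimal sketch of that dispatch pattern; InstanceEvents, pop_instance_event and external_instance_event are the Nova names visible in the log records, but the threading.Event-based waiter table is an illustrative simplification, not Nova's actual implementation.

    import threading
    from collections import defaultdict

    class InstanceEvents:
        """Sketch of the waiter table implied by the log records above."""

        def __init__(self):
            # Nova guards its table with an oslo_concurrency lock named
            # "<instance-uuid>-events"; a plain threading.Lock stands in here.
            self._lock = threading.Lock()
            self._waiters = defaultdict(dict)  # uuid -> {event name: Event}

        def prepare_for_event(self, instance_uuid, event_name):
            # Registered before starting the operation that triggers the
            # event, e.g. before asking Neutron to plug a VIF.
            ev = threading.Event()
            with self._lock:
                self._waiters[instance_uuid][event_name] = ev
            return ev

        def pop_instance_event(self, instance_uuid, event_name):
            # The "_pop_event" critical section seen in the lock records:
            # remove and return the waiter, if any, under the events lock.
            with self._lock:
                return self._waiters[instance_uuid].pop(event_name, None)

        def external_instance_event(self, instance_uuid, event_name):
            # Entry point for notifications such as
            # network-vif-plugged-<port-id> arriving from Neutron.
            waiter = self.pop_instance_event(instance_uuid, event_name)
            if waiter is None:
                # Corresponds to "No waiting events found dispatching ..."
                # and the WARNING "Received unexpected event ..." above.
                print("WARNING: unexpected event %s for %s"
                      % (event_name, instance_uuid))
            else:
                waiter.set()  # wake the thread blocked on this event
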
[ 684.797042] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b16eb10-828c-476b-bcf6-e55f3075e64b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.806202] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f1d8dd2-1812-4ebe-a139-a69b75c9a9c0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.850536] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a54a69b0-a586-40b7-9471-27f5cb6ee2f6 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.861029] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fac8887-fff9-440c-9588-9f35d869f699 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.880502] env[67270]: DEBUG nova.compute.provider_tree [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 684.886169] env[67270]: DEBUG nova.scheduler.client.report [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 684.901141] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.901659] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Start building networks asynchronously for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 684.942853] env[67270]: DEBUG nova.compute.utils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 684.944381] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 684.944550] env[67270]: DEBUG nova.network.neutron [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 684.955093] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 685.041503] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Start spawning the instance on the hypervisor. 
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 685.070643] env[67270]: DEBUG nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 685.070873] env[67270]: DEBUG nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 685.071292] env[67270]: DEBUG nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 685.071292] env[67270]: DEBUG nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 685.071443] env[67270]: DEBUG nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 685.071496] env[67270]: DEBUG nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 685.071712] env[67270]: DEBUG nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 685.071874] env[67270]: DEBUG nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 685.072053] env[67270]: DEBUG 
nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 685.072261] env[67270]: DEBUG nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 685.072458] env[67270]: DEBUG nova.virt.hardware [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 685.073408] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07c11ddb-8095-4907-bae7-3ca613d718bd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.082367] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97b287d2-b7f0-4f58-8d24-7597c6a4ac98 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.203946] env[67270]: DEBUG nova.network.neutron [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Updating instance_info_cache with network_info: [{"id": "c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3", "address": "fa:16:3e:59:bd:e0", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc63c7b1d-c7", "ovs_interfaceid": "c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 685.220719] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Releasing lock "refresh_cache-c2867798-9109-4f85-ae60-3830a711f21f" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 685.221298] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 
tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Instance network_info: |[{"id": "c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3", "address": "fa:16:3e:59:bd:e0", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc63c7b1d-c7", "ovs_interfaceid": "c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 685.221985] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:59:bd:e0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dc16c915-cff1-4faa-a529-9773ee9bab7e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 685.234106] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Creating folder: Project (ee7ebb60622f48bb974035d46b75c62f). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 685.234106] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e2c030db-124c-4e51-8067-65e03f025882 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.249035] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Created folder: Project (ee7ebb60622f48bb974035d46b75c62f) in parent group-v814248. [ 685.249035] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Creating folder: Instances. Parent ref: group-v814252. 
{{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 685.249035] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8ff8a984-96f8-453f-b59e-32652612aee4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.260576] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Created folder: Instances in parent group-v814252. [ 685.260838] env[67270]: DEBUG oslo.service.loopingcall [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 685.261047] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 685.261284] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-04d1af91-6f66-413c-8b92-633d98de8216 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.286108] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 685.286108] env[67270]: value = "task-4110553" [ 685.286108] env[67270]: _type = "Task" [ 685.286108] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 685.291944] env[67270]: DEBUG nova.network.neutron [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Successfully created port: 12812dd9-99bc-43d3-9c7a-082794d25e12 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 685.298738] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110553, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 685.587035] env[67270]: DEBUG nova.policy [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8098d01ff18d4eb086f45ac8073bf93a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7056e3aa803c478f8163b5a9712e9388', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 685.802441] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110553, 'name': CreateVM_Task, 'duration_secs': 0.449921} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 685.802570] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 685.928057] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 685.928957] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 685.932019] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 685.932019] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-510bfa96-72e8-4973-8ed0-a71d07b23d0d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.936823] env[67270]: DEBUG oslo_vmware.api [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Waiting for the task: (returnval){ [ 685.936823] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52bd3bdb-2138-045a-b9db-52cd7e823e4b" [ 685.936823] env[67270]: _type = "Task" [ 685.936823] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 685.949134] env[67270]: DEBUG oslo_vmware.api [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52bd3bdb-2138-045a-b9db-52cd7e823e4b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 686.448754] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 686.449611] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 686.449985] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 686.616076] env[67270]: DEBUG nova.network.neutron [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Successfully created port: a58ae5fc-5754-4f64-8584-a21e10bd8a9a {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 686.622134] env[67270]: DEBUG nova.network.neutron [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Successfully created port: 39304fd9-a61c-46f8-9290-f36e8659225b {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 687.374844] env[67270]: DEBUG nova.network.neutron [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Successfully created port: 411ab6ff-c0ec-4478-88c9-9620271f45b3 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 688.625288] env[67270]: DEBUG nova.network.neutron [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Successfully created port: cd6ae397-f33b-48ee-b806-8dba5b6642fd {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 688.891961] env[67270]: DEBUG nova.compute.manager [req-8522cfd3-77ab-442b-95b3-ff05a45a3771 req-b469bf43-0a55-47e8-9a34-69425fb76b23 service nova] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Received event network-changed-c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 688.891961] env[67270]: DEBUG nova.compute.manager [req-8522cfd3-77ab-442b-95b3-ff05a45a3771 req-b469bf43-0a55-47e8-9a34-69425fb76b23 service nova] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Refreshing instance network info cache due to event 
network-changed-c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 688.891961] env[67270]: DEBUG oslo_concurrency.lockutils [req-8522cfd3-77ab-442b-95b3-ff05a45a3771 req-b469bf43-0a55-47e8-9a34-69425fb76b23 service nova] Acquiring lock "refresh_cache-c2867798-9109-4f85-ae60-3830a711f21f" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 688.892093] env[67270]: DEBUG oslo_concurrency.lockutils [req-8522cfd3-77ab-442b-95b3-ff05a45a3771 req-b469bf43-0a55-47e8-9a34-69425fb76b23 service nova] Acquired lock "refresh_cache-c2867798-9109-4f85-ae60-3830a711f21f" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 688.893304] env[67270]: DEBUG nova.network.neutron [req-8522cfd3-77ab-442b-95b3-ff05a45a3771 req-b469bf43-0a55-47e8-9a34-69425fb76b23 service nova] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Refreshing network info cache for port c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 689.770683] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 689.771100] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 689.771281] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 689.771407] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 689.793156] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 689.793333] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 689.793551] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 689.794174] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Skipping network cache update for instance because it is Building. 
{{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 689.794174] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 689.794174] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 689.794174] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 689.795278] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 689.797053] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 689.797053] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 689.797053] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 689.797053] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 689.797053] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 689.797053] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 689.797347] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 689.797347] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 689.823671] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.824172] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.824462] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 689.824720] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 689.826583] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-075e48df-ed9d-47b2-a1fd-644b9afb20a8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.840728] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f64f634-f2ba-453f-98f7-c6827c3a2018 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.858766] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61e3d290-b823-487e-b576-84eb7c9d6a0f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.867419] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51e6686e-fe91-4fc4-8e15-16b34af9be2d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.907470] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180801MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 689.907470] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.907470] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.021041] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance c2867798-9109-4f85-ae60-3830a711f21f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 690.021041] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 891481a1-edb6-4111-9779-23ba64d85dce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 690.021041] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a51d9480-1aa1-48c9-a05c-943589d6a224 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 690.021041] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4a086288-b773-40aa-b39a-e3f3b9784a05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 690.021251] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 1e482ed7-9c9f-4713-abde-291417686a78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 690.021251] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance c847f4cb-1914-497b-8d63-5b99a237e5e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 690.021308] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 379f5a6d-d6d4-434a-b401-1b027434e6fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 690.021468] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 690.021608] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 690.166098] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f908500f-1e0a-4009-affc-24bcd2ac04e1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.177294] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73723080-1707-4a48-bfd7-289f4967359c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.217503] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6de15ce0-0e9a-4e07-938a-fdae8dcf2a53 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.225721] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1821fea1-7704-4dcd-8ef2-88304a3001d9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.242247] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 690.251610] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 690.272603] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 690.272819] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.366s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.859791] env[67270]: DEBUG nova.network.neutron [None req-462407f8-1900-4026-8afd-6839acf17845 
tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Successfully updated port: 39304fd9-a61c-46f8-9290-f36e8659225b {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 691.871747] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Acquiring lock "refresh_cache-1e482ed7-9c9f-4713-abde-291417686a78" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 691.872786] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Acquired lock "refresh_cache-1e482ed7-9c9f-4713-abde-291417686a78" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 691.873064] env[67270]: DEBUG nova.network.neutron [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 692.068031] env[67270]: DEBUG nova.network.neutron [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Successfully updated port: 12812dd9-99bc-43d3-9c7a-082794d25e12 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 692.088017] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquiring lock "refresh_cache-a51d9480-1aa1-48c9-a05c-943589d6a224" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 692.088185] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquired lock "refresh_cache-a51d9480-1aa1-48c9-a05c-943589d6a224" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 692.088346] env[67270]: DEBUG nova.network.neutron [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 692.090218] env[67270]: DEBUG nova.network.neutron [req-8522cfd3-77ab-442b-95b3-ff05a45a3771 req-b469bf43-0a55-47e8-9a34-69425fb76b23 service nova] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Updated VIF entry in instance network info cache for port c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 692.091110] env[67270]: DEBUG nova.network.neutron [req-8522cfd3-77ab-442b-95b3-ff05a45a3771 req-b469bf43-0a55-47e8-9a34-69425fb76b23 service nova] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Updating instance_info_cache with network_info: [{"id": "c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3", "address": "fa:16:3e:59:bd:e0", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.152", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc63c7b1d-c7", "ovs_interfaceid": "c63c7b1d-c7b9-4ac9-8cdb-be2c27f2b4d3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 692.104307] env[67270]: DEBUG oslo_concurrency.lockutils [req-8522cfd3-77ab-442b-95b3-ff05a45a3771 req-b469bf43-0a55-47e8-9a34-69425fb76b23 service nova] Releasing lock "refresh_cache-c2867798-9109-4f85-ae60-3830a711f21f" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 692.116024] env[67270]: DEBUG nova.network.neutron [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 692.236242] env[67270]: DEBUG nova.network.neutron [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Instance cache missing network info. 
{{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 692.961125] env[67270]: DEBUG nova.network.neutron [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Successfully updated port: cd6ae397-f33b-48ee-b806-8dba5b6642fd {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 692.981022] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquiring lock "refresh_cache-379f5a6d-d6d4-434a-b401-1b027434e6fd" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 692.981022] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquired lock "refresh_cache-379f5a6d-d6d4-434a-b401-1b027434e6fd" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 692.981022] env[67270]: DEBUG nova.network.neutron [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 692.995167] env[67270]: DEBUG nova.network.neutron [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Successfully updated port: a58ae5fc-5754-4f64-8584-a21e10bd8a9a {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 693.016705] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquiring lock "refresh_cache-4a086288-b773-40aa-b39a-e3f3b9784a05" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 693.017281] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquired lock "refresh_cache-4a086288-b773-40aa-b39a-e3f3b9784a05" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 693.017677] env[67270]: DEBUG nova.network.neutron [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 693.214549] env[67270]: DEBUG nova.network.neutron [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Instance cache missing network info. 
{{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 693.219275] env[67270]: DEBUG nova.compute.manager [req-ece6f6a0-f6e0-481f-b66c-58b149c92438 req-58aec300-5295-4664-b498-cb877a1112b5 service nova] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Received event network-vif-plugged-39304fd9-a61c-46f8-9290-f36e8659225b {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 693.219544] env[67270]: DEBUG oslo_concurrency.lockutils [req-ece6f6a0-f6e0-481f-b66c-58b149c92438 req-58aec300-5295-4664-b498-cb877a1112b5 service nova] Acquiring lock "1e482ed7-9c9f-4713-abde-291417686a78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.219974] env[67270]: DEBUG oslo_concurrency.lockutils [req-ece6f6a0-f6e0-481f-b66c-58b149c92438 req-58aec300-5295-4664-b498-cb877a1112b5 service nova] Lock "1e482ed7-9c9f-4713-abde-291417686a78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.219974] env[67270]: DEBUG oslo_concurrency.lockutils [req-ece6f6a0-f6e0-481f-b66c-58b149c92438 req-58aec300-5295-4664-b498-cb877a1112b5 service nova] Lock "1e482ed7-9c9f-4713-abde-291417686a78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.220205] env[67270]: DEBUG nova.compute.manager [req-ece6f6a0-f6e0-481f-b66c-58b149c92438 req-58aec300-5295-4664-b498-cb877a1112b5 service nova] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] No waiting events found dispatching network-vif-plugged-39304fd9-a61c-46f8-9290-f36e8659225b {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 693.220317] env[67270]: WARNING nova.compute.manager [req-ece6f6a0-f6e0-481f-b66c-58b149c92438 req-58aec300-5295-4664-b498-cb877a1112b5 service nova] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Received unexpected event network-vif-plugged-39304fd9-a61c-46f8-9290-f36e8659225b for instance with vm_state building and task_state spawning. [ 693.225074] env[67270]: DEBUG nova.network.neutron [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Instance cache missing network info. 
{{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 693.256482] env[67270]: DEBUG nova.compute.manager [req-f6074f8f-60fd-4ca3-aa5f-01d9be01e4d2 req-92bf46bb-e995-419e-87fa-cf216e0d9b33 service nova] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Received event network-vif-plugged-12812dd9-99bc-43d3-9c7a-082794d25e12 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 693.256752] env[67270]: DEBUG oslo_concurrency.lockutils [req-f6074f8f-60fd-4ca3-aa5f-01d9be01e4d2 req-92bf46bb-e995-419e-87fa-cf216e0d9b33 service nova] Acquiring lock "a51d9480-1aa1-48c9-a05c-943589d6a224-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.258344] env[67270]: DEBUG oslo_concurrency.lockutils [req-f6074f8f-60fd-4ca3-aa5f-01d9be01e4d2 req-92bf46bb-e995-419e-87fa-cf216e0d9b33 service nova] Lock "a51d9480-1aa1-48c9-a05c-943589d6a224-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.258344] env[67270]: DEBUG oslo_concurrency.lockutils [req-f6074f8f-60fd-4ca3-aa5f-01d9be01e4d2 req-92bf46bb-e995-419e-87fa-cf216e0d9b33 service nova] Lock "a51d9480-1aa1-48c9-a05c-943589d6a224-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.258514] env[67270]: DEBUG nova.compute.manager [req-f6074f8f-60fd-4ca3-aa5f-01d9be01e4d2 req-92bf46bb-e995-419e-87fa-cf216e0d9b33 service nova] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] No waiting events found dispatching network-vif-plugged-12812dd9-99bc-43d3-9c7a-082794d25e12 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 693.258793] env[67270]: WARNING nova.compute.manager [req-f6074f8f-60fd-4ca3-aa5f-01d9be01e4d2 req-92bf46bb-e995-419e-87fa-cf216e0d9b33 service nova] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Received unexpected event network-vif-plugged-12812dd9-99bc-43d3-9c7a-082794d25e12 for instance with vm_state building and task_state spawning. 
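The entries above trace Nova's external-event handshake: for each "network-vif-plugged" event the service acquires a per-instance "<uuid>-events" lock, looks for a registered waiter via InstanceEvents.pop_instance_event, and, finding none because the instance is still building, emits the "Received unexpected event" warning. The following is a minimal illustrative sketch of that dispatch pattern, not Nova's actual code; the class and function names mirror the log but the bodies are simplified stand-ins.

```python
# Simplified sketch of the per-instance event dispatch seen in the log above.
# Not Nova's real implementation: waiters and locking are reduced to plain
# threading primitives for illustration.
import threading

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()   # stands in for the "<uuid>-events" lock
        self._waiters = {}              # (instance_uuid, event_name) -> threading.Event

    def prepare_for_event(self, instance_uuid, event_name):
        # The build thread registers a waiter before it expects the plug event.
        waiter = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = waiter
        return waiter

    def pop_instance_event(self, instance_uuid, event_name):
        # The acquire/release bracket logged above happens around this pop.
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)

def external_instance_event(events, instance_uuid, event_name):
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        # Mirrors the WARNING: "Received unexpected event ... for instance
        # with vm_state building and task_state spawning."
        print(f"WARNING: unexpected event {event_name} for {instance_uuid}")
    else:
        waiter.set()  # wake the thread blocked on the vif-plugged handshake
```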
[ 693.358891] env[67270]: DEBUG nova.network.neutron [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Updating instance_info_cache with network_info: [{"id": "39304fd9-a61c-46f8-9290-f36e8659225b", "address": "fa:16:3e:5b:8e:bf", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap39304fd9-a6", "ovs_interfaceid": "39304fd9-a61c-46f8-9290-f36e8659225b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.385870] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Releasing lock "refresh_cache-1e482ed7-9c9f-4713-abde-291417686a78" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 693.385982] env[67270]: DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Instance network_info: |[{"id": "39304fd9-a61c-46f8-9290-f36e8659225b", "address": "fa:16:3e:5b:8e:bf", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap39304fd9-a6", "ovs_interfaceid": "39304fd9-a61c-46f8-9290-f36e8659225b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 693.386702] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 
tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5b:8e:bf', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dc16c915-cff1-4faa-a529-9773ee9bab7e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '39304fd9-a61c-46f8-9290-f36e8659225b', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 693.402086] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Creating folder: Project (ef24d5b59ad54caeac45b979a55750c2). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 693.402086] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2b807fc9-0518-4c25-a755-1bc068895a21 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.416761] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Created folder: Project (ef24d5b59ad54caeac45b979a55750c2) in parent group-v814248. [ 693.417080] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Creating folder: Instances. Parent ref: group-v814255. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 693.417292] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ded7d02b-a6d2-44f4-9a69-2a18be4923f0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.433112] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Created folder: Instances in parent group-v814255. [ 693.433388] env[67270]: DEBUG oslo.service.loopingcall [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 693.433691] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 693.433782] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1ede9d39-5d27-4542-bac9-5b3d92e45a85 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.459289] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 693.459289] env[67270]: value = "task-4110556" [ 693.459289] env[67270]: _type = "Task" [ 693.459289] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 693.472087] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110556, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 693.709796] env[67270]: DEBUG nova.network.neutron [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Successfully updated port: 411ab6ff-c0ec-4478-88c9-9620271f45b3 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 693.722119] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquiring lock "refresh_cache-c847f4cb-1914-497b-8d63-5b99a237e5e6" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 693.722250] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquired lock "refresh_cache-c847f4cb-1914-497b-8d63-5b99a237e5e6" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 693.722469] env[67270]: DEBUG nova.network.neutron [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 693.916295] env[67270]: DEBUG nova.network.neutron [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Updating instance_info_cache with network_info: [{"id": "12812dd9-99bc-43d3-9c7a-082794d25e12", "address": "fa:16:3e:86:71:f9", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap12812dd9-99", "ovs_interfaceid": "12812dd9-99bc-43d3-9c7a-082794d25e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.917571] env[67270]: DEBUG nova.network.neutron [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 
tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 693.944036] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Releasing lock "refresh_cache-a51d9480-1aa1-48c9-a05c-943589d6a224" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 693.944271] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Instance network_info: |[{"id": "12812dd9-99bc-43d3-9c7a-082794d25e12", "address": "fa:16:3e:86:71:f9", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap12812dd9-99", "ovs_interfaceid": "12812dd9-99bc-43d3-9c7a-082794d25e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 693.944659] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:86:71:f9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dc16c915-cff1-4faa-a529-9773ee9bab7e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '12812dd9-99bc-43d3-9c7a-082794d25e12', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 693.959025] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Creating folder: Project (c721a209cc994945ac297086b06cedd5). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 693.959025] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-024d6416-41f3-4202-b18a-d2e7287b3d77 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.977078] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110556, 'name': CreateVM_Task, 'duration_secs': 0.341509} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 693.977367] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 693.977423] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Created folder: Project (c721a209cc994945ac297086b06cedd5) in parent group-v814248. [ 693.977576] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Creating folder: Instances. Parent ref: group-v814258. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 693.978438] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 693.978624] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 693.978947] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 693.979246] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7564daf1-bdc2-4210-9d52-4cdbec5c7327 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.981087] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eb710e97-b3e6-4368-883f-1d063ae98345 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.990101] env[67270]: DEBUG oslo_vmware.api [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Waiting for the task: (returnval){ [ 693.990101] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52b17f86-8a3d-d401-f64c-92cf05884f06" [ 693.990101] env[67270]: _type = "Task" [ 693.990101] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 693.996709] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Created folder: Instances in parent group-v814258. 
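The wait_for_task / _poll_task pairs above (api.py:397 and api.py:434) are oslo_vmware's task-polling loop: every vCenter call that returns a Task moref, such as CreateVM_Task or SearchDatastore_Task, is awaited by a FixedIntervalLoopingCall that re-polls the task until it leaves the running state. A minimal self-contained sketch of that pattern follows; FakeTask stands in for a live vSphere task (the class and the 0.5s interval are illustrative, not Nova code):

from oslo_service import loopingcall

class FakeTask:
    """Stand-in for a vSphere Task (cf. task-4110556); finishes on the third poll."""
    def __init__(self):
        self.polls = 0

    def poll(self):
        self.polls += 1
        if self.polls >= 3:
            return 'success', 100
        return 'running', 33 * self.polls

def wait_for_task(task, poll_interval=0.5):
    # Mirrors the shape of oslo_vmware.api.VMwareAPISession.wait_for_task:
    # poll on a fixed interval and raise LoopingCallDone to stop the loop
    # and hand a result back to the waiter.
    def _poll():
        state, progress = task.poll()
        print('progress is %d%%' % progress)  # cf. the _poll_task lines above
        if state == 'success':
            raise loopingcall.LoopingCallDone(progress)
    timer = loopingcall.FixedIntervalLoopingCall(_poll)
    return timer.start(interval=poll_interval).wait()

print(wait_for_task(FakeTask()))  # prints 33%, 66%, 100%, then returns 100
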
[ 693.996881] env[67270]: DEBUG oslo.service.loopingcall [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 694.000979] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 694.002213] env[67270]: DEBUG oslo_vmware.api [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52b17f86-8a3d-d401-f64c-92cf05884f06, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 694.002213] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-57ffb39a-76af-4ccb-80d1-eb1e90a217d2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.027238] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 694.027238] env[67270]: value = "task-4110559" [ 694.027238] env[67270]: _type = "Task" [ 694.027238] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 694.038889] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110559, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 694.336897] env[67270]: DEBUG nova.network.neutron [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Updating instance_info_cache with network_info: [{"id": "a58ae5fc-5754-4f64-8584-a21e10bd8a9a", "address": "fa:16:3e:0b:d3:3c", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.34", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa58ae5fc-57", "ovs_interfaceid": "a58ae5fc-5754-4f64-8584-a21e10bd8a9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 694.355403] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Releasing lock 
"refresh_cache-4a086288-b773-40aa-b39a-e3f3b9784a05" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 694.355714] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Instance network_info: |[{"id": "a58ae5fc-5754-4f64-8584-a21e10bd8a9a", "address": "fa:16:3e:0b:d3:3c", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.34", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa58ae5fc-57", "ovs_interfaceid": "a58ae5fc-5754-4f64-8584-a21e10bd8a9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 694.356701] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0b:d3:3c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dc16c915-cff1-4faa-a529-9773ee9bab7e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a58ae5fc-5754-4f64-8584-a21e10bd8a9a', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 694.367511] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Creating folder: Project (fbad7749517c40908ca9f1ac15dc514b). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 694.368197] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d2f9d652-356c-4cdb-aedf-91a445a2bb91 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.380836] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Created folder: Project (fbad7749517c40908ca9f1ac15dc514b) in parent group-v814248. [ 694.382555] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Creating folder: Instances. Parent ref: group-v814261. 
{{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 694.383406] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-45b076e0-8a26-4a17-ad76-ef4b2d4e2774 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.396805] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Created folder: Instances in parent group-v814261. [ 694.397126] env[67270]: DEBUG oslo.service.loopingcall [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 694.397428] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 694.397703] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-05627d76-2d2f-4ee8-bdb2-422ea61bbc5a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.424292] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 694.424292] env[67270]: value = "task-4110562" [ 694.424292] env[67270]: _type = "Task" [ 694.424292] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 694.433149] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110562, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 694.458167] env[67270]: DEBUG nova.network.neutron [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Updating instance_info_cache with network_info: [{"id": "cd6ae397-f33b-48ee-b806-8dba5b6642fd", "address": "fa:16:3e:ca:e5:bb", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcd6ae397-f3", "ovs_interfaceid": "cd6ae397-f33b-48ee-b806-8dba5b6642fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 694.472186] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Releasing lock "refresh_cache-379f5a6d-d6d4-434a-b401-1b027434e6fd" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 694.472530] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Instance network_info: |[{"id": "cd6ae397-f33b-48ee-b806-8dba5b6642fd", "address": "fa:16:3e:ca:e5:bb", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcd6ae397-f3", "ovs_interfaceid": "cd6ae397-f33b-48ee-b806-8dba5b6642fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 694.474144] env[67270]: DEBUG nova.virt.vmwareapi.vmops 
[None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ca:e5:bb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dc16c915-cff1-4faa-a529-9773ee9bab7e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cd6ae397-f33b-48ee-b806-8dba5b6642fd', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 694.486615] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Creating folder: Project (7056e3aa803c478f8163b5a9712e9388). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 694.487393] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-52383e71-89f1-4753-b426-9f3b69131c22 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.510030] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 694.510030] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 694.510030] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 694.512088] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Created folder: Project (7056e3aa803c478f8163b5a9712e9388) in parent group-v814248. [ 694.512484] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Creating folder: Instances. Parent ref: group-v814264. 
{{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 694.512867] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c3c3cc6f-2252-4d6c-8017-046eebf95474 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.526174] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Created folder: Instances in parent group-v814264. [ 694.526174] env[67270]: DEBUG oslo.service.loopingcall [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 694.526174] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 694.526174] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-883c58e3-27d3-4444-9b48-3d7d7609e0a0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.559890] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110559, 'name': CreateVM_Task, 'duration_secs': 0.325124} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 694.559890] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 694.559890] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 694.559890] env[67270]: value = "task-4110565" [ 694.559890] env[67270]: _type = "Task" [ 694.559890] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 694.560439] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 694.560598] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 694.560910] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 694.565571] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1275c85f-37c4-45cf-93e9-6d807fe66a0f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.578215] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110565, 'name': CreateVM_Task} progress is 6%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 694.578643] env[67270]: DEBUG oslo_vmware.api [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Waiting for the task: (returnval){ [ 694.578643] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52aa124b-898e-75f5-86ae-f8d0906a59fb" [ 694.578643] env[67270]: _type = "Task" [ 694.578643] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 694.597197] env[67270]: DEBUG oslo_vmware.api [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52aa124b-898e-75f5-86ae-f8d0906a59fb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 694.942825] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110562, 'name': CreateVM_Task, 'duration_secs': 0.348141} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 694.942956] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 694.943566] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 695.058304] env[67270]: DEBUG nova.network.neutron [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Updating instance_info_cache with network_info: [{"id": "411ab6ff-c0ec-4478-88c9-9620271f45b3", "address": "fa:16:3e:5b:b8:8c", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.245", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap411ab6ff-c0", "ovs_interfaceid": "411ab6ff-c0ec-4478-88c9-9620271f45b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 695.078170] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110565, 'name': CreateVM_Task} progress is 99%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 695.079558] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Releasing lock "refresh_cache-c847f4cb-1914-497b-8d63-5b99a237e5e6" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 695.080031] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Instance network_info: |[{"id": "411ab6ff-c0ec-4478-88c9-9620271f45b3", "address": "fa:16:3e:5b:b8:8c", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.245", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap411ab6ff-c0", "ovs_interfaceid": "411ab6ff-c0ec-4478-88c9-9620271f45b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 695.083682] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5b:b8:8c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dc16c915-cff1-4faa-a529-9773ee9bab7e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '411ab6ff-c0ec-4478-88c9-9620271f45b3', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 695.091192] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Creating folder: Project (f397f9a74958467bbb12b0dea4e7ef2e). Parent ref: group-v814248. 
{{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 695.096486] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f47c4657-ed7c-4a0c-a1f8-7367a05043a2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.105882] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 695.106123] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 695.106340] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 695.106839] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 695.106839] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 695.107104] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2e452e49-bda7-498b-b386-423e5286a9d6 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.112736] env[67270]: DEBUG oslo_vmware.api [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Waiting for the task: (returnval){ [ 695.112736] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]527592de-976e-66bd-2b88-5a61a4c4ef31" [ 695.112736] env[67270]: _type = "Task" [ 695.112736] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 695.117482] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Created folder: Project (f397f9a74958467bbb12b0dea4e7ef2e) in parent group-v814248. 
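The Folder.CreateFolder calls logged above build the vmwareapi driver's folder layout: under the driver's root VM folder (group-v814248 in this run) each project gets a "Project (<uuid>)" folder, with an "Instances" folder nested inside it for the VMs. A rough sketch of the pair of calls, assuming an established oslo_vmware VMwareAPISession ('session' and 'root_folder_ref' are hypothetical here); the real helper additionally tolerates DuplicateName races between concurrent builds by looking up the existing child folder, which is elided below:

def create_project_folders(session, root_folder_ref, project_id):
    # 'session' is assumed to be a connected oslo_vmware.api.VMwareAPISession
    # and 'root_folder_ref' the moref of the driver's root VM folder
    # (group-v814248 in the trace above).
    project_folder = session.invoke_api(
        session.vim, 'CreateFolder', root_folder_ref,
        name='Project (%s)' % project_id)
    # Second level, matching the "Creating folder: Instances" records above.
    return session.invoke_api(
        session.vim, 'CreateFolder', project_folder, name='Instances')
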
[ 695.117677] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Creating folder: Instances. Parent ref: group-v814267. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 695.118300] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a0d54e28-2284-4354-bbe7-1aff8e87312c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.125482] env[67270]: DEBUG oslo_vmware.api [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]527592de-976e-66bd-2b88-5a61a4c4ef31, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 695.137018] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Created folder: Instances in parent group-v814267. [ 695.137018] env[67270]: DEBUG oslo.service.loopingcall [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 695.137018] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 695.137018] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cc74e6d9-68e0-449a-ac57-572363a57181 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.162435] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 695.162435] env[67270]: value = "task-4110568" [ 695.162435] env[67270]: _type = "Task" [ 695.162435] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 695.172718] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110568, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 695.580867] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110565, 'name': CreateVM_Task} progress is 99%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 695.624993] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 695.625165] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 695.625379] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 695.632836] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquiring lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.633120] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.645846] env[67270]: DEBUG nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 695.676877] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110568, 'name': CreateVM_Task, 'duration_secs': 0.378023} completed successfully.
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 695.677028] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 695.677893] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 695.677893] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 695.678289] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 695.678830] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cfdb1911-d899-45ee-a2ca-480c1c592b69 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.685072] env[67270]: DEBUG oslo_vmware.api [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Waiting for the task: (returnval){ [ 695.685072] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52c6d00c-0bf7-fb48-7fcc-541d3b0695af" [ 695.685072] env[67270]: _type = "Task" [ 695.685072] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 695.702534] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 695.702784] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 695.702988] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 695.709819] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.710079] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.711623] env[67270]: INFO nova.compute.claims [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 696.011521] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0c430da-0d09-4dcf-b69b-0e9eb1a13976 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.022955] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa5c91b4-d434-4489-aed3-621982e6f3f5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.061581] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d602038-6538-4bc6-b54e-e244503a4fe2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.075828] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110565, 'name': CreateVM_Task} progress is 99%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 696.077167] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfe9ecfd-9a4a-4785-8232-6dd8f691cc84 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.095984] env[67270]: DEBUG nova.compute.provider_tree [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 696.106176] env[67270]: DEBUG nova.scheduler.client.report [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 696.123603] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.413s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.125059] env[67270]: DEBUG nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 696.171390] env[67270]: DEBUG nova.compute.utils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 696.173843] env[67270]: DEBUG nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Allocating IP information in the background. 
{{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 696.175483] env[67270]: DEBUG nova.network.neutron [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 696.183156] env[67270]: DEBUG nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 696.272791] env[67270]: DEBUG nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 696.313237] env[67270]: DEBUG nova.virt.hardware [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 696.314556] env[67270]: DEBUG nova.virt.hardware [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 696.314924] env[67270]: DEBUG nova.virt.hardware [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 696.317160] env[67270]: DEBUG nova.virt.hardware [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 696.317160] env[67270]: DEBUG nova.virt.hardware [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 696.317160] env[67270]: DEBUG nova.virt.hardware [None 
req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 696.317160] env[67270]: DEBUG nova.virt.hardware [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 696.317160] env[67270]: DEBUG nova.virt.hardware [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 696.317387] env[67270]: DEBUG nova.virt.hardware [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 696.317492] env[67270]: DEBUG nova.virt.hardware [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 696.317636] env[67270]: DEBUG nova.virt.hardware [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 696.321293] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ce94d98-8639-452d-b002-7a025435ebe8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.329137] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdd62a42-15f6-4a9b-8b15-e37de0a6076c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.427313] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Acquiring lock "eff1fe32-1755-4536-9ad9-286e1392a08d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.427963] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Lock "eff1fe32-1755-4536-9ad9-286e1392a08d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.443737] env[67270]: DEBUG nova.policy [None
req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1f6c94c88f88410f85d4253ea3652e30', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ce242258f724d98b81e0ca098bbab6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 696.446416] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 696.525038] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.525310] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.526756] env[67270]: INFO nova.compute.claims [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 696.576990] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110565, 'name': CreateVM_Task, 'duration_secs': 1.964381} completed successfully. 
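CreateVM_Task above finishes through oslo.vmware's task-polling loop; the _poll_task lines are that loop reporting progress. A hedged sketch of the same flow with oslo.vmware's public API (host, credentials and the managed-object references vm_folder, config_spec and res_pool are placeholders):

from oslo_vmware import api as vmware_api

session = vmware_api.VMwareAPISession(
    'vc1.example.test', 'user', 'secret',
    10,    # api_retry_count
    0.5)   # task_poll_interval, seconds between the poll lines above

# invoke_api() issues the SOAP call and returns the task reference;
# wait_for_task() polls TaskInfo until 'success' or raises on error.
task = session.invoke_api(session.vim, 'CreateVM_Task', vm_folder,
                          config=config_spec, pool=res_pool)
task_info = session.wait_for_task(task)
print(task_info.result)  # moref of the new VM ("Created VM on the ESX host")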
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 696.583050] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 696.583769] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 696.583980] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 696.584343] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 696.584934] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0e1f41f2-4432-448b-9c96-851e8844445e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.592134] env[67270]: DEBUG oslo_vmware.api [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Waiting for the task: (returnval){ [ 696.592134] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52f03912-bb62-a2dd-536a-5c3e025f78ff" [ 696.592134] env[67270]: _type = "Task" [ 696.592134] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 696.605241] env[67270]: DEBUG oslo_vmware.api [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52f03912-bb62-a2dd-536a-5c3e025f78ff, 'name': SearchDatastore_Task} progress is 0%. 
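The "[datastore1] devstack-image-cache_base/..." lock and external semaphore above serialize concurrent spawns against the same cached image; the lock name is simply the image's datastore path. A sketch of the context-manager form (fetch_image_if_missing is a hypothetical stand-in for the cache-fill step):

from oslo_concurrency import lockutils

image_id = '1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a'
cache_path = '[datastore1] devstack-image-cache_base/' + image_id

# lockutils.lock() emits the Acquiring/Acquired/Releasing lines seen here;
# only one worker at a time may fetch or convert this image into the cache.
with lockutils.lock(cache_path):
    fetch_image_if_missing(cache_path)  # hypothetical helper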
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 696.786549] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbaa25b2-8369-474d-8e9a-f65c9dca4d15 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.799873] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-637fb196-24d1-4c9c-a91b-dbeb79cb1e2e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.835956] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bab17c7a-e21e-4778-939c-ff6f7b1ae9ae {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.845586] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de3e9bdd-3e05-4146-a761-c25d0a9b6d96 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.861109] env[67270]: DEBUG nova.compute.provider_tree [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 696.875738] env[67270]: DEBUG nova.scheduler.client.report [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 696.895424] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.370s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.895970] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Start building networks asynchronously for instance. 
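The inventory dict that the report client keeps comparing above is exactly the Placement API payload for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd. A hedged sketch of setting the same inventory by hand over REST (endpoint, token and generation are placeholders; the numbers are copied from the log):

import requests

inventories = {
    'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
             'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                  'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                'step_size': 1, 'allocation_ratio': 1.0},
}
rp = 'ddbaf518-603f-4953-8d5d-25c9ed7292bd'
resp = requests.put(
    'https://placement.example.test/resource_providers/%s/inventories' % rp,
    json={'resource_provider_generation': 1, 'inventories': inventories},
    headers={'X-Auth-Token': 'TOKEN',
             'OpenStack-API-Version': 'placement 1.26'})
resp.raise_for_status()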
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 696.939083] env[67270]: DEBUG nova.compute.utils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 696.941467] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 696.941732] env[67270]: DEBUG nova.network.neutron [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 696.961277] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 697.055624] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 697.090037] env[67270]: DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=<?>,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-05-14T00:53:51Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 697.090338] env[67270]: DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 697.090393] env[67270]: DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 697.090565] env[67270]:
DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 697.090710] env[67270]: DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 697.090858] env[67270]: DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 697.094577] env[67270]: DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 697.096663] env[67270]: DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 697.096782] env[67270]: DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 697.096961] env[67270]: DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 697.097158] env[67270]: DEBUG nova.virt.hardware [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 697.101883] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bb83fb4-f8f8-49bf-98d0-dea978101a95 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.122624] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquiring lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.124541] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248
tempest-MigrationsAdminTest-811059248-project-member] Lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.130113] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0501246e-1d20-4e38-95f2-fc72041b66c2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.138973] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 697.139291] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 697.139502] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 697.150232] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Starting instance...
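The repeated "Build topologies for 1 vcpu(s) 1:1:1 ... Got 1 possible topologies" walkthrough above enumerates (sockets, cores, threads) factorizations of the flavor's vCPU count under the 65536 default limits shown in the "Chose sockets=0, cores=0, threads=0; limits were ..." lines. A simplified re-derivation of that enumeration, not Nova's actual implementation:

def possible_cpu_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
    """Yield (sockets, cores, threads) triples whose product is vcpus."""
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    yield (sockets, cores, threads)

# m1.nano has vcpus=1, so exactly one topology survives, matching the log:
print(list(possible_cpu_topologies(1)))  # [(1, 1, 1)]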
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 697.168105] env[67270]: DEBUG nova.policy [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be49f769c1634f3a87291c5ab54ff57b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c463d87274e1465cb56e100993537ed6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 697.213498] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.214092] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.215351] env[67270]: INFO nova.compute.claims [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 697.488344] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9db45aaf-a3db-4d0b-8fa7-f1f12799b566 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.503677] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d3f0f16-9934-4e45-9029-5577e41eb988 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.540566] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2733aec0-1deb-4098-9ff4-d2f60a9514cb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.550312] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c304e612-3da6-45bf-be41-0e2d4ce07a70 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.577143] env[67270]: DEBUG nova.compute.provider_tree [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 697.594086] env[67270]: DEBUG nova.scheduler.client.report [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 
tempest-MigrationsAdminTest-811059248-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 697.598877] env[67270]: DEBUG nova.network.neutron [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Successfully created port: c1e9079a-5010-492a-ae4a-e02f3122c92f {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 697.632080] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.418s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.632158] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 697.687154] env[67270]: DEBUG nova.compute.utils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 697.689316] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 697.689492] env[67270]: DEBUG nova.network.neutron [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 697.709625] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 697.809861] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Start spawning the instance on the hypervisor. 
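Each "Successfully created port: ..." line is Nova asking Neutron for a port on the instance's network. A rough openstacksdk equivalent (the cloud name assumes a clouds.yaml entry; the network and instance UUIDs are the ones port c1e9079a-5010-492a-ae4a-e02f3122c92f is bound to later in the log):

import openstack

conn = openstack.connect(cloud='devstack')  # hypothetical clouds.yaml entry
port = conn.network.create_port(
    network_id='de705c96-225a-4784-adb8-b2e9401f463f',
    device_id='a073c7a9-d7ee-4d9e-be23-4345ed5f9047',
    device_owner='compute:nova')
print(port.id)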
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 697.842559] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=<?>,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-05-14T00:53:51Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 697.842835] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 697.842901] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 697.843505] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 697.843729] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 697.843881] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 697.844121] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 697.844301] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 697.844444] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b
tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 697.844600] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 697.844762] env[67270]: DEBUG nova.virt.hardware [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 697.845808] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14c6bfd2-d41c-4ba2-8b4e-cdc6495c2ae3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.858267] env[67270]: DEBUG nova.policy [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77a49c78f83a459b83ad2aba656428b4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32cfd83297b44cc3a2ab44b3a4d9b5c5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 697.861039] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cf6c922-d993-4d4c-9185-9efe6f578fbf {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.230775] env[67270]: DEBUG nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Received event network-vif-plugged-a58ae5fc-5754-4f64-8584-a21e10bd8a9a {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 698.231063] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Acquiring lock "4a086288-b773-40aa-b39a-e3f3b9784a05-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.231693] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Lock "4a086288-b773-40aa-b39a-e3f3b9784a05-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.231693] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Lock "4a086288-b773-40aa-b39a-e3f3b9784a05-events" "released" by
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.231693] env[67270]: DEBUG nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] No waiting events found dispatching network-vif-plugged-a58ae5fc-5754-4f64-8584-a21e10bd8a9a {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 698.231812] env[67270]: WARNING nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Received unexpected event network-vif-plugged-a58ae5fc-5754-4f64-8584-a21e10bd8a9a for instance with vm_state building and task_state spawning. [ 698.231928] env[67270]: DEBUG nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Received event network-changed-12812dd9-99bc-43d3-9c7a-082794d25e12 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 698.232097] env[67270]: DEBUG nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Refreshing instance network info cache due to event network-changed-12812dd9-99bc-43d3-9c7a-082794d25e12. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 698.232279] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Acquiring lock "refresh_cache-a51d9480-1aa1-48c9-a05c-943589d6a224" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 698.232416] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Acquired lock "refresh_cache-a51d9480-1aa1-48c9-a05c-943589d6a224" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 698.232615] env[67270]: DEBUG nova.network.neutron [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Refreshing network info cache for port 12812dd9-99bc-43d3-9c7a-082794d25e12 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 698.251015] env[67270]: DEBUG nova.network.neutron [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Successfully created port: 87b5465f-ed96-4b26-9631-39904b49bd35 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 698.394521] env[67270]: DEBUG nova.compute.manager [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Received event network-changed-39304fd9-a61c-46f8-9290-f36e8659225b {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 698.396223] env[67270]: DEBUG nova.compute.manager [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 
1e482ed7-9c9f-4713-abde-291417686a78] Refreshing instance network info cache due to event network-changed-39304fd9-a61c-46f8-9290-f36e8659225b. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 698.396699] env[67270]: DEBUG oslo_concurrency.lockutils [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] Acquiring lock "refresh_cache-1e482ed7-9c9f-4713-abde-291417686a78" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 698.397074] env[67270]: DEBUG oslo_concurrency.lockutils [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] Acquired lock "refresh_cache-1e482ed7-9c9f-4713-abde-291417686a78" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 698.398823] env[67270]: DEBUG nova.network.neutron [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Refreshing network info cache for port 39304fd9-a61c-46f8-9290-f36e8659225b {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 698.524156] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquiring lock "8b43a9a6-b28c-43ed-9f83-02424f73dc3c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.524156] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Lock "8b43a9a6-b28c-43ed-9f83-02424f73dc3c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.482332] env[67270]: DEBUG nova.network.neutron [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Updated VIF entry in instance network info cache for port 12812dd9-99bc-43d3-9c7a-082794d25e12.
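The "Received event network-vif-plugged-..." and "network-changed-..." records are Neutron notifying Nova through the os-server-external-events API; the WARNING about an unexpected event only means the instance was not yet waiting for it. A hedged sketch of the sender's side with requests (endpoint and token are placeholders; the UUIDs are copied from the log):

import requests

body = {'events': [{
    'name': 'network-vif-plugged',
    'tag': 'a58ae5fc-5754-4f64-8584-a21e10bd8a9a',        # the port UUID
    'server_uuid': '4a086288-b773-40aa-b39a-e3f3b9784a05',
    'status': 'completed',
}]}
resp = requests.post(
    'https://nova.example.test/v2.1/os-server-external-events',
    json=body, headers={'X-Auth-Token': 'TOKEN'})
resp.raise_for_status()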
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 699.482693] env[67270]: DEBUG nova.network.neutron [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Updating instance_info_cache with network_info: [{"id": "12812dd9-99bc-43d3-9c7a-082794d25e12", "address": "fa:16:3e:86:71:f9", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.38", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap12812dd9-99", "ovs_interfaceid": "12812dd9-99bc-43d3-9c7a-082794d25e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 699.493398] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Releasing lock "refresh_cache-a51d9480-1aa1-48c9-a05c-943589d6a224" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 699.493644] env[67270]: DEBUG nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Received event network-vif-plugged-411ab6ff-c0ec-4478-88c9-9620271f45b3 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 699.493839] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Acquiring lock "c847f4cb-1914-497b-8d63-5b99a237e5e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.494051] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Lock "c847f4cb-1914-497b-8d63-5b99a237e5e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.494210] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Lock "c847f4cb-1914-497b-8d63-5b99a237e5e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 699.494478] env[67270]: DEBUG nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f
req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] No waiting events found dispatching network-vif-plugged-411ab6ff-c0ec-4478-88c9-9620271f45b3 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 699.494614] env[67270]: WARNING nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Received unexpected event network-vif-plugged-411ab6ff-c0ec-4478-88c9-9620271f45b3 for instance with vm_state building and task_state spawning. [ 699.494782] env[67270]: DEBUG nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Received event network-changed-a58ae5fc-5754-4f64-8584-a21e10bd8a9a {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 699.494943] env[67270]: DEBUG nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Refreshing instance network info cache due to event network-changed-a58ae5fc-5754-4f64-8584-a21e10bd8a9a. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 699.495147] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Acquiring lock "refresh_cache-4a086288-b773-40aa-b39a-e3f3b9784a05" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 699.495289] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Acquired lock "refresh_cache-4a086288-b773-40aa-b39a-e3f3b9784a05" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 699.495444] env[67270]: DEBUG nova.network.neutron [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Refreshing network info cache for port a58ae5fc-5754-4f64-8584-a21e10bd8a9a {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 699.815087] env[67270]: DEBUG nova.network.neutron [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Successfully created port: 970ebb87-755a-4bd3-9420-771af476ee80 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 699.991590] env[67270]: DEBUG nova.network.neutron [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Updated VIF entry in instance network info cache for port 39304fd9-a61c-46f8-9290-f36e8659225b. 
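The "Updating instance_info_cache with network_info:" payloads above are JSON lists of VIF dicts, so small inspection helpers are easy to write. A sketch that pulls the fixed IPs out of one cache row (the structure is copied from the log):

import json

def fixed_ips(network_info_json):
    """Return every fixed IP address in an instance_info_cache row."""
    vifs = json.loads(network_info_json)
    return [ip['address']
            for vif in vifs
            for subnet in vif['network']['subnets']
            for ip in subnet['ips']
            if ip['type'] == 'fixed']

# Against the 39304fd9-... row below this returns ['192.168.233.20'].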
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 699.991957] env[67270]: DEBUG nova.network.neutron [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Updating instance_info_cache with network_info: [{"id": "39304fd9-a61c-46f8-9290-f36e8659225b", "address": "fa:16:3e:5b:8e:bf", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap39304fd9-a6", "ovs_interfaceid": "39304fd9-a61c-46f8-9290-f36e8659225b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 700.001898] env[67270]: DEBUG oslo_concurrency.lockutils [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] Releasing lock "refresh_cache-1e482ed7-9c9f-4713-abde-291417686a78" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 700.002163] env[67270]: DEBUG nova.compute.manager [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Received event network-vif-plugged-cd6ae397-f33b-48ee-b806-8dba5b6642fd {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 700.002357] env[67270]: DEBUG oslo_concurrency.lockutils [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] Acquiring lock "379f5a6d-d6d4-434a-b401-1b027434e6fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.002561] env[67270]: DEBUG oslo_concurrency.lockutils [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] Lock "379f5a6d-d6d4-434a-b401-1b027434e6fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.002724] env[67270]: DEBUG oslo_concurrency.lockutils [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] Lock "379f5a6d-d6d4-434a-b401-1b027434e6fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.002887] env[67270]: DEBUG nova.compute.manager [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac
req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] No waiting events found dispatching network-vif-plugged-cd6ae397-f33b-48ee-b806-8dba5b6642fd {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 700.003070] env[67270]: WARNING nova.compute.manager [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Received unexpected event network-vif-plugged-cd6ae397-f33b-48ee-b806-8dba5b6642fd for instance with vm_state building and task_state spawning. [ 700.003238] env[67270]: DEBUG nova.compute.manager [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Received event network-changed-cd6ae397-f33b-48ee-b806-8dba5b6642fd {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 700.003392] env[67270]: DEBUG nova.compute.manager [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Refreshing instance network info cache due to event network-changed-cd6ae397-f33b-48ee-b806-8dba5b6642fd. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 700.003571] env[67270]: DEBUG oslo_concurrency.lockutils [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] Acquiring lock "refresh_cache-379f5a6d-d6d4-434a-b401-1b027434e6fd" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 700.003753] env[67270]: DEBUG oslo_concurrency.lockutils [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] Acquired lock "refresh_cache-379f5a6d-d6d4-434a-b401-1b027434e6fd" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 700.003878] env[67270]: DEBUG nova.network.neutron [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Refreshing network info cache for port cd6ae397-f33b-48ee-b806-8dba5b6642fd {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 700.526071] env[67270]: DEBUG nova.network.neutron [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Successfully updated port: c1e9079a-5010-492a-ae4a-e02f3122c92f {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 700.545178] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquiring lock "refresh_cache-a073c7a9-d7ee-4d9e-be23-4345ed5f9047" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 700.545178] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquired lock "refresh_cache-a073c7a9-d7ee-4d9e-be23-4345ed5f9047" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 700.545178] env[67270]: DEBUG nova.network.neutron [None 
req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 700.869927] env[67270]: DEBUG nova.network.neutron [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 700.995408] env[67270]: DEBUG nova.network.neutron [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Updated VIF entry in instance network info cache for port a58ae5fc-5754-4f64-8584-a21e10bd8a9a. {{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 700.995408] env[67270]: DEBUG nova.network.neutron [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Updating instance_info_cache with network_info: [{"id": "a58ae5fc-5754-4f64-8584-a21e10bd8a9a", "address": "fa:16:3e:0b:d3:3c", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.34", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa58ae5fc-57", "ovs_interfaceid": "a58ae5fc-5754-4f64-8584-a21e10bd8a9a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 701.008514] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Releasing lock "refresh_cache-4a086288-b773-40aa-b39a-e3f3b9784a05" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 701.008934] env[67270]: DEBUG nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Received event network-changed-411ab6ff-c0ec-4478-88c9-9620271f45b3 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 701.009373] env[67270]: DEBUG nova.compute.manager [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Refreshing instance network info cache due to event network-changed-411ab6ff-c0ec-4478-88c9-9620271f45b3. 
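Note the devname fields in these cache rows: "tapa58ae5fc-57" for port a58ae5fc-5754-.... The tap device name appears to be the port UUID prefixed with "tap" and truncated to fit the kernel's 15-character interface-name limit; a one-liner reproducing the pattern observed in this log (an inference from the data, not Neutron's literal code):

def tap_name(port_id):
    # 'tap' plus the first 11 characters of the port UUID = 14 chars.
    return 'tap' + port_id[:11]

assert tap_name('a58ae5fc-5754-4f64-8584-a21e10bd8a9a') == 'tapa58ae5fc-57'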
{{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 701.009736] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Acquiring lock "refresh_cache-c847f4cb-1914-497b-8d63-5b99a237e5e6" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 701.010031] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Acquired lock "refresh_cache-c847f4cb-1914-497b-8d63-5b99a237e5e6" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 701.011403] env[67270]: DEBUG nova.network.neutron [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Refreshing network info cache for port 411ab6ff-c0ec-4478-88c9-9620271f45b3 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 701.346481] env[67270]: DEBUG nova.network.neutron [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Successfully updated port: 87b5465f-ed96-4b26-9631-39904b49bd35 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 701.355179] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Acquiring lock "refresh_cache-eff1fe32-1755-4536-9ad9-286e1392a08d" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 701.355315] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Acquired lock "refresh_cache-eff1fe32-1755-4536-9ad9-286e1392a08d" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 701.356012] env[67270]: DEBUG nova.network.neutron [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 701.468603] env[67270]: DEBUG nova.network.neutron [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 701.543492] env[67270]: DEBUG nova.network.neutron [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Updated VIF entry in instance network info cache for port cd6ae397-f33b-48ee-b806-8dba5b6642fd. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 701.543756] env[67270]: DEBUG nova.network.neutron [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Updating instance_info_cache with network_info: [{"id": "cd6ae397-f33b-48ee-b806-8dba5b6642fd", "address": "fa:16:3e:ca:e5:bb", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcd6ae397-f3", "ovs_interfaceid": "cd6ae397-f33b-48ee-b806-8dba5b6642fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 701.559151] env[67270]: DEBUG oslo_concurrency.lockutils [req-b69f4fa9-4800-4d36-bd1d-e515f3c88dac req-c27f0a82-c4de-402d-94bf-5f36161edb20 service nova] Releasing lock "refresh_cache-379f5a6d-d6d4-434a-b401-1b027434e6fd" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 701.590393] env[67270]: DEBUG nova.network.neutron [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Updating instance_info_cache with network_info: [{"id": "c1e9079a-5010-492a-ae4a-e02f3122c92f", "address": "fa:16:3e:a2:b0:fa", "network": {"id": "de705c96-225a-4784-adb8-b2e9401f463f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1091423232-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ce242258f724d98b81e0ca098bbab6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1e9079a-50", "ovs_interfaceid": "c1e9079a-5010-492a-ae4a-e02f3122c92f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 701.618039] env[67270]: DEBUG oslo_concurrency.lockutils [None 
req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Releasing lock "refresh_cache-a073c7a9-d7ee-4d9e-be23-4345ed5f9047" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 701.618355] env[67270]: DEBUG nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Instance network_info: |[{"id": "c1e9079a-5010-492a-ae4a-e02f3122c92f", "address": "fa:16:3e:a2:b0:fa", "network": {"id": "de705c96-225a-4784-adb8-b2e9401f463f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1091423232-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ce242258f724d98b81e0ca098bbab6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1e9079a-50", "ovs_interfaceid": "c1e9079a-5010-492a-ae4a-e02f3122c92f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 701.625769] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a2:b0:fa', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '19598cc1-e105-4565-906a-09dde75e3fbe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c1e9079a-5010-492a-ae4a-e02f3122c92f', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 701.636014] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Creating folder: Project (7ce242258f724d98b81e0ca098bbab6a). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 701.639434] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ce0b6486-994f-41c5-b369-2646124b9073 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.649854] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Created folder: Project (7ce242258f724d98b81e0ca098bbab6a) in parent group-v814248. 
[ 701.650242] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Creating folder: Instances. Parent ref: group-v814270. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 701.650358] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-560b091f-96b4-4a67-984d-2463cf873c63 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.663532] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Created folder: Instances in parent group-v814270. [ 701.663532] env[67270]: DEBUG oslo.service.loopingcall [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 701.663532] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 701.663532] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6d812996-36e5-41d6-b665-e25896f14190 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.687007] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 701.687007] env[67270]: value = "task-4110571" [ 701.687007] env[67270]: _type = "Task" [ 701.687007] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 701.699462] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110571, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 702.179876] env[67270]: DEBUG nova.network.neutron [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Updated VIF entry in instance network info cache for port 411ab6ff-c0ec-4478-88c9-9620271f45b3. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 702.179876] env[67270]: DEBUG nova.network.neutron [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Updating instance_info_cache with network_info: [{"id": "411ab6ff-c0ec-4478-88c9-9620271f45b3", "address": "fa:16:3e:5b:b8:8c", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.245", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap411ab6ff-c0", "ovs_interfaceid": "411ab6ff-c0ec-4478-88c9-9620271f45b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.195946] env[67270]: DEBUG oslo_concurrency.lockutils [req-ac8d2f3e-042c-4ac6-a1da-7b386640f08f req-caa0b9b9-e4b4-4307-bfa8-ffc98e47c44d service nova] Releasing lock "refresh_cache-c847f4cb-1914-497b-8d63-5b99a237e5e6" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 702.204405] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110571, 'name': CreateVM_Task, 'duration_secs': 0.322352} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 702.204405] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 702.205431] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 702.205648] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 702.206036] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 702.206350] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cfa20bcd-2d70-4f59-969a-d30ba9202d21 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.213258] env[67270]: DEBUG oslo_vmware.api [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Waiting for the task: (returnval){ [ 702.213258] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52412642-dc41-7162-8d5b-2706ad21a397" [ 702.213258] env[67270]: _type = "Task" [ 702.213258] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 702.223598] env[67270]: DEBUG oslo_vmware.api [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52412642-dc41-7162-8d5b-2706ad21a397, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 702.278785] env[67270]: DEBUG nova.network.neutron [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Updating instance_info_cache with network_info: [{"id": "87b5465f-ed96-4b26-9631-39904b49bd35", "address": "fa:16:3e:18:15:ee", "network": {"id": "c6986037-0932-4904-8753-309f357cbbee", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-807174541-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c463d87274e1465cb56e100993537ed6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "edd47158-6f4b-44a1-8e82-0411205ad299", "external-id": "nsx-vlan-transportzone-587", "segmentation_id": 587, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap87b5465f-ed", "ovs_interfaceid": "87b5465f-ed96-4b26-9631-39904b49bd35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.294766] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Releasing lock "refresh_cache-eff1fe32-1755-4536-9ad9-286e1392a08d" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 702.295090] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Instance network_info: |[{"id": "87b5465f-ed96-4b26-9631-39904b49bd35", "address": "fa:16:3e:18:15:ee", "network": {"id": "c6986037-0932-4904-8753-309f357cbbee", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-807174541-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c463d87274e1465cb56e100993537ed6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "edd47158-6f4b-44a1-8e82-0411205ad299", "external-id": "nsx-vlan-transportzone-587", "segmentation_id": 587, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap87b5465f-ed", "ovs_interfaceid": "87b5465f-ed96-4b26-9631-39904b49bd35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 
702.295471] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:18:15:ee', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'edd47158-6f4b-44a1-8e82-0411205ad299', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '87b5465f-ed96-4b26-9631-39904b49bd35', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 702.304367] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Creating folder: Project (c463d87274e1465cb56e100993537ed6). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 702.304946] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-528dc4ca-35ff-498b-9365-29b780fe36ea {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.318027] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Created folder: Project (c463d87274e1465cb56e100993537ed6) in parent group-v814248. [ 702.318027] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Creating folder: Instances. Parent ref: group-v814273. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 702.318027] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-81cd2e21-00d8-495a-bd8a-44759c8a01d7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.329699] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Created folder: Instances in parent group-v814273. [ 702.331610] env[67270]: DEBUG oslo.service.loopingcall [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 702.331610] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 702.331610] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-37a256e1-1aa2-4730-a3ee-bf66ded8c664 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.360180] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 702.360180] env[67270]: value = "task-4110574" [ 702.360180] env[67270]: _type = "Task" [ 702.360180] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 702.370772] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110574, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 702.729637] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 702.729637] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 702.729637] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 702.736181] env[67270]: DEBUG nova.network.neutron [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Successfully updated port: 970ebb87-755a-4bd3-9420-771af476ee80 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 702.747212] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquiring lock "refresh_cache-5d61c322-6a7d-4991-8cc4-6dcb1be74256" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 702.747212] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquired lock "refresh_cache-5d61c322-6a7d-4991-8cc4-6dcb1be74256" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 702.747409] env[67270]: DEBUG nova.network.neutron [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 702.878148] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110574, 'name': CreateVM_Task, 'duration_secs': 0.351754} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 702.878428] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 702.880725] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 702.880725] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 702.880725] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 702.880725] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c94e13f0-fff9-404d-992c-73aee7e7cbb4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.887926] env[67270]: DEBUG oslo_vmware.api [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Waiting for the task: (returnval){ [ 702.887926] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52055d92-ad42-c3d7-5859-a84f10d9e451" [ 702.887926] env[67270]: _type = "Task" [ 702.887926] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 702.900797] env[67270]: DEBUG oslo_vmware.api [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52055d92-ad42-c3d7-5859-a84f10d9e451, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 703.076764] env[67270]: DEBUG nova.network.neutron [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Instance cache missing network info. 
{{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 703.410656] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 703.410995] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 703.412030] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 703.606494] env[67270]: DEBUG nova.network.neutron [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Updating instance_info_cache with network_info: [{"id": "970ebb87-755a-4bd3-9420-771af476ee80", "address": "fa:16:3e:e8:e1:e0", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap970ebb87-75", "ovs_interfaceid": "970ebb87-755a-4bd3-9420-771af476ee80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 703.627699] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Releasing lock "refresh_cache-5d61c322-6a7d-4991-8cc4-6dcb1be74256" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 703.627699] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Instance network_info: |[{"id": "970ebb87-755a-4bd3-9420-771af476ee80", "address": "fa:16:3e:e8:e1:e0", "network": {"id": 
"a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap970ebb87-75", "ovs_interfaceid": "970ebb87-755a-4bd3-9420-771af476ee80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 703.628317] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e8:e1:e0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dc16c915-cff1-4faa-a529-9773ee9bab7e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '970ebb87-755a-4bd3-9420-771af476ee80', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 703.636985] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Creating folder: Project (32cfd83297b44cc3a2ab44b3a4d9b5c5). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 703.637603] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b6112105-0306-4b5a-809c-ac97e06ae0ff {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.655656] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Created folder: Project (32cfd83297b44cc3a2ab44b3a4d9b5c5) in parent group-v814248. [ 703.655656] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Creating folder: Instances. Parent ref: group-v814276. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 703.656038] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2325178f-286d-4bed-a7f0-fd84797dc1d7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.673267] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Created folder: Instances in parent group-v814276. 
[ 703.673358] env[67270]: DEBUG oslo.service.loopingcall [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 703.676032] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 703.676032] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b8852408-b7da-4168-b18e-8f80694f87d4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.696906] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 703.696906] env[67270]: value = "task-4110577" [ 703.696906] env[67270]: _type = "Task" [ 703.696906] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 703.711287] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110577, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 704.062656] env[67270]: DEBUG nova.compute.manager [req-7103a75d-be22-49c8-8a6e-4518d0a10f44 req-9ba67bf6-49c9-4c5c-ba23-bfce4c2d0d54 service nova] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Received event network-vif-plugged-c1e9079a-5010-492a-ae4a-e02f3122c92f {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 704.062910] env[67270]: DEBUG oslo_concurrency.lockutils [req-7103a75d-be22-49c8-8a6e-4518d0a10f44 req-9ba67bf6-49c9-4c5c-ba23-bfce4c2d0d54 service nova] Acquiring lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.063074] env[67270]: DEBUG oslo_concurrency.lockutils [req-7103a75d-be22-49c8-8a6e-4518d0a10f44 req-9ba67bf6-49c9-4c5c-ba23-bfce4c2d0d54 service nova] Lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.063235] env[67270]: DEBUG oslo_concurrency.lockutils [req-7103a75d-be22-49c8-8a6e-4518d0a10f44 req-9ba67bf6-49c9-4c5c-ba23-bfce4c2d0d54 service nova] Lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.066392] env[67270]: DEBUG nova.compute.manager [req-7103a75d-be22-49c8-8a6e-4518d0a10f44 req-9ba67bf6-49c9-4c5c-ba23-bfce4c2d0d54 service nova] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] No waiting events found dispatching network-vif-plugged-c1e9079a-5010-492a-ae4a-e02f3122c92f {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 704.066524] env[67270]: WARNING nova.compute.manager [req-7103a75d-be22-49c8-8a6e-4518d0a10f44 req-9ba67bf6-49c9-4c5c-ba23-bfce4c2d0d54 service nova] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Received unexpected event network-vif-plugged-c1e9079a-5010-492a-ae4a-e02f3122c92f for instance 
with vm_state building and task_state spawning. [ 704.208146] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110577, 'name': CreateVM_Task, 'duration_secs': 0.329033} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 704.208327] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 704.209019] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 704.209261] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 704.209879] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 704.210275] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-97ab404a-5eff-4264-873f-3462854721e9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.218738] env[67270]: DEBUG oslo_vmware.api [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Waiting for the task: (returnval){ [ 704.218738] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52ddcc5f-bd2d-706b-6e25-272ede697b70" [ 704.218738] env[67270]: _type = "Task" [ 704.218738] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 704.232139] env[67270]: DEBUG oslo_vmware.api [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52ddcc5f-bd2d-706b-6e25-272ede697b70, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 704.419516] env[67270]: DEBUG nova.compute.manager [req-4d8023fc-1237-4f8c-9a27-95611cf5f199 req-43d87b34-fbe1-4756-aee6-33b6c98cabe5 service nova] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Received event network-vif-plugged-970ebb87-755a-4bd3-9420-771af476ee80 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 704.419741] env[67270]: DEBUG oslo_concurrency.lockutils [req-4d8023fc-1237-4f8c-9a27-95611cf5f199 req-43d87b34-fbe1-4756-aee6-33b6c98cabe5 service nova] Acquiring lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.419943] env[67270]: DEBUG oslo_concurrency.lockutils [req-4d8023fc-1237-4f8c-9a27-95611cf5f199 req-43d87b34-fbe1-4756-aee6-33b6c98cabe5 service nova] Lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.420208] env[67270]: DEBUG oslo_concurrency.lockutils [req-4d8023fc-1237-4f8c-9a27-95611cf5f199 req-43d87b34-fbe1-4756-aee6-33b6c98cabe5 service nova] Lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.420307] env[67270]: DEBUG nova.compute.manager [req-4d8023fc-1237-4f8c-9a27-95611cf5f199 req-43d87b34-fbe1-4756-aee6-33b6c98cabe5 service nova] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] No waiting events found dispatching network-vif-plugged-970ebb87-755a-4bd3-9420-771af476ee80 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 704.420462] env[67270]: WARNING nova.compute.manager [req-4d8023fc-1237-4f8c-9a27-95611cf5f199 req-43d87b34-fbe1-4756-aee6-33b6c98cabe5 service nova] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Received unexpected event network-vif-plugged-970ebb87-755a-4bd3-9420-771af476ee80 for instance with vm_state building and task_state spawning. 
[ 704.735439] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 704.735770] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 704.735995] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.422167] env[67270]: DEBUG nova.compute.manager [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Received event network-changed-c1e9079a-5010-492a-ae4a-e02f3122c92f {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 707.422167] env[67270]: DEBUG nova.compute.manager [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Refreshing instance network info cache due to event network-changed-c1e9079a-5010-492a-ae4a-e02f3122c92f. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 707.422167] env[67270]: DEBUG oslo_concurrency.lockutils [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] Acquiring lock "refresh_cache-a073c7a9-d7ee-4d9e-be23-4345ed5f9047" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.422167] env[67270]: DEBUG oslo_concurrency.lockutils [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] Acquired lock "refresh_cache-a073c7a9-d7ee-4d9e-be23-4345ed5f9047" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 707.422821] env[67270]: DEBUG nova.network.neutron [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Refreshing network info cache for port c1e9079a-5010-492a-ae4a-e02f3122c92f {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 707.934084] env[67270]: DEBUG nova.network.neutron [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Updated VIF entry in instance network info cache for port c1e9079a-5010-492a-ae4a-e02f3122c92f. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 707.934509] env[67270]: DEBUG nova.network.neutron [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Updating instance_info_cache with network_info: [{"id": "c1e9079a-5010-492a-ae4a-e02f3122c92f", "address": "fa:16:3e:a2:b0:fa", "network": {"id": "de705c96-225a-4784-adb8-b2e9401f463f", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1091423232-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ce242258f724d98b81e0ca098bbab6a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19598cc1-e105-4565-906a-09dde75e3fbe", "external-id": "nsx-vlan-transportzone-371", "segmentation_id": 371, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1e9079a-50", "ovs_interfaceid": "c1e9079a-5010-492a-ae4a-e02f3122c92f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.954027] env[67270]: DEBUG oslo_concurrency.lockutils [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] Releasing lock "refresh_cache-a073c7a9-d7ee-4d9e-be23-4345ed5f9047" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 707.954283] env[67270]: DEBUG nova.compute.manager [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Received event network-vif-plugged-87b5465f-ed96-4b26-9631-39904b49bd35 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 707.954450] env[67270]: DEBUG oslo_concurrency.lockutils [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] Acquiring lock "eff1fe32-1755-4536-9ad9-286e1392a08d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.954663] env[67270]: DEBUG oslo_concurrency.lockutils [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] Lock "eff1fe32-1755-4536-9ad9-286e1392a08d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.954825] env[67270]: DEBUG oslo_concurrency.lockutils [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] Lock "eff1fe32-1755-4536-9ad9-286e1392a08d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.954990] env[67270]: DEBUG nova.compute.manager 
[req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] No waiting events found dispatching network-vif-plugged-87b5465f-ed96-4b26-9631-39904b49bd35 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 707.955178] env[67270]: WARNING nova.compute.manager [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Received unexpected event network-vif-plugged-87b5465f-ed96-4b26-9631-39904b49bd35 for instance with vm_state building and task_state spawning. [ 707.955352] env[67270]: DEBUG nova.compute.manager [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Received event network-changed-87b5465f-ed96-4b26-9631-39904b49bd35 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 707.955511] env[67270]: DEBUG nova.compute.manager [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Refreshing instance network info cache due to event network-changed-87b5465f-ed96-4b26-9631-39904b49bd35. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 707.955693] env[67270]: DEBUG oslo_concurrency.lockutils [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] Acquiring lock "refresh_cache-eff1fe32-1755-4536-9ad9-286e1392a08d" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.955827] env[67270]: DEBUG oslo_concurrency.lockutils [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] Acquired lock "refresh_cache-eff1fe32-1755-4536-9ad9-286e1392a08d" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 707.959022] env[67270]: DEBUG nova.network.neutron [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Refreshing network info cache for port 87b5465f-ed96-4b26-9631-39904b49bd35 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 708.433134] env[67270]: DEBUG nova.compute.manager [req-ee53a475-f0ff-410a-9f20-6f82fcf7bcbe req-a47cd187-f927-4143-8ca8-8bb5f7223917 service nova] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Received event network-changed-970ebb87-755a-4bd3-9420-771af476ee80 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 708.433134] env[67270]: DEBUG nova.compute.manager [req-ee53a475-f0ff-410a-9f20-6f82fcf7bcbe req-a47cd187-f927-4143-8ca8-8bb5f7223917 service nova] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Refreshing instance network info cache due to event network-changed-970ebb87-755a-4bd3-9420-771af476ee80. 
{{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 708.433134] env[67270]: DEBUG oslo_concurrency.lockutils [req-ee53a475-f0ff-410a-9f20-6f82fcf7bcbe req-a47cd187-f927-4143-8ca8-8bb5f7223917 service nova] Acquiring lock "refresh_cache-5d61c322-6a7d-4991-8cc4-6dcb1be74256" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 708.433134] env[67270]: DEBUG oslo_concurrency.lockutils [req-ee53a475-f0ff-410a-9f20-6f82fcf7bcbe req-a47cd187-f927-4143-8ca8-8bb5f7223917 service nova] Acquired lock "refresh_cache-5d61c322-6a7d-4991-8cc4-6dcb1be74256" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 708.433134] env[67270]: DEBUG nova.network.neutron [req-ee53a475-f0ff-410a-9f20-6f82fcf7bcbe req-a47cd187-f927-4143-8ca8-8bb5f7223917 service nova] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Refreshing network info cache for port 970ebb87-755a-4bd3-9420-771af476ee80 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 708.584786] env[67270]: DEBUG nova.network.neutron [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Updated VIF entry in instance network info cache for port 87b5465f-ed96-4b26-9631-39904b49bd35. {{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 708.588415] env[67270]: DEBUG nova.network.neutron [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Updating instance_info_cache with network_info: [{"id": "87b5465f-ed96-4b26-9631-39904b49bd35", "address": "fa:16:3e:18:15:ee", "network": {"id": "c6986037-0932-4904-8753-309f357cbbee", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-807174541-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c463d87274e1465cb56e100993537ed6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "edd47158-6f4b-44a1-8e82-0411205ad299", "external-id": "nsx-vlan-transportzone-587", "segmentation_id": 587, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap87b5465f-ed", "ovs_interfaceid": "87b5465f-ed96-4b26-9631-39904b49bd35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.599185] env[67270]: DEBUG oslo_concurrency.lockutils [req-4a7e1427-179d-49ca-82e5-28d0e137fab4 req-7e9d77ee-09ee-4c7a-bfec-55e30ff07219 service nova] Releasing lock "refresh_cache-eff1fe32-1755-4536-9ad9-286e1392a08d" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 709.111291] env[67270]: DEBUG nova.network.neutron [req-ee53a475-f0ff-410a-9f20-6f82fcf7bcbe req-a47cd187-f927-4143-8ca8-8bb5f7223917 service nova] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Updated VIF entry in instance network info cache 
for port 970ebb87-755a-4bd3-9420-771af476ee80. {{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 709.111291] env[67270]: DEBUG nova.network.neutron [req-ee53a475-f0ff-410a-9f20-6f82fcf7bcbe req-a47cd187-f927-4143-8ca8-8bb5f7223917 service nova] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Updating instance_info_cache with network_info: [{"id": "970ebb87-755a-4bd3-9420-771af476ee80", "address": "fa:16:3e:e8:e1:e0", "network": {"id": "a164bec6-77e2-44d1-8780-61c9b6046405", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "b4cc8d13a7354de8be4a029915d283ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dc16c915-cff1-4faa-a529-9773ee9bab7e", "external-id": "nsx-vlan-transportzone-93", "segmentation_id": 93, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap970ebb87-75", "ovs_interfaceid": "970ebb87-755a-4bd3-9420-771af476ee80", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 709.125339] env[67270]: DEBUG oslo_concurrency.lockutils [req-ee53a475-f0ff-410a-9f20-6f82fcf7bcbe req-a47cd187-f927-4143-8ca8-8bb5f7223917 service nova] Releasing lock "refresh_cache-5d61c322-6a7d-4991-8cc4-6dcb1be74256" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 714.434563] env[67270]: DEBUG nova.compute.manager [req-26426811-c9e4-4c95-bd31-677f687edb18 req-3d2596c5-8a4a-4399-bcdb-90c3120d7bdc service nova] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Received event network-vif-deleted-87b5465f-ed96-4b26-9631-39904b49bd35 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 732.931983] env[67270]: WARNING oslo_vmware.rw_handles [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles raise 
RemoteDisconnected("Remote end closed connection without" [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 732.931983] env[67270]: ERROR oslo_vmware.rw_handles [ 732.932720] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/1d76118f-eb84-479c-828e-6064be110616/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 732.935325] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 732.935642] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Copying Virtual Disk [datastore1] vmware_temp/1d76118f-eb84-479c-828e-6064be110616/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/1d76118f-eb84-479c-828e-6064be110616/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 732.936203] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7ad87952-e806-4f5f-b749-f8c956bb12fd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.947396] env[67270]: DEBUG oslo_vmware.api [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Waiting for the task: (returnval){ [ 732.947396] env[67270]: value = "task-4110578" [ 732.947396] env[67270]: _type = "Task" [ 732.947396] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 732.958099] env[67270]: DEBUG oslo_vmware.api [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Task: {'id': task-4110578, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 733.463031] env[67270]: DEBUG oslo_vmware.exceptions [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Fault InvalidArgument not matched. 
{{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 733.463680] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 733.467884] env[67270]: ERROR nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 733.467884] env[67270]: Faults: ['InvalidArgument'] [ 733.467884] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Traceback (most recent call last): [ 733.467884] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 733.467884] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] yield resources [ 733.467884] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 733.467884] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] self.driver.spawn(context, instance, image_meta, [ 733.467884] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 733.467884] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] self._vmops.spawn(context, instance, image_meta, injected_files, [ 733.467884] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 733.467884] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] self._fetch_image_if_missing(context, vi) [ 733.467884] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] image_cache(vi, tmp_image_ds_loc) [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] vm_util.copy_virtual_disk( [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] session._wait_for_task(vmdk_copy_task) [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 733.468297] 
env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] return self.wait_for_task(task_ref) [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] return evt.wait() [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] result = hub.switch() [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 733.468297] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] return self.greenlet.switch() [ 733.468697] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 733.468697] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] self.f(*self.args, **self.kw) [ 733.468697] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 733.468697] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] raise exceptions.translate_fault(task_info.error) [ 733.468697] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 733.468697] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Faults: ['InvalidArgument'] [ 733.468697] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] [ 733.468697] env[67270]: INFO nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Terminating instance [ 733.469487] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 733.469757] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 733.470320] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquiring lock "refresh_cache-891481a1-edb6-4111-9779-23ba64d85dce" {{(pid=67270) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 733.470466] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquired lock "refresh_cache-891481a1-edb6-4111-9779-23ba64d85dce" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 733.470704] env[67270]: DEBUG nova.network.neutron [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 733.471594] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9acea014-30dc-40ff-921a-c15b2da6a9a8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.484087] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 733.484216] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 733.485237] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0c49e278-7b09-4c10-bb43-33fc38e6f546 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.494660] env[67270]: DEBUG oslo_vmware.api [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Waiting for the task: (returnval){ [ 733.494660] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52871c69-4e15-b88d-4179-e629a56dc365" [ 733.494660] env[67270]: _type = "Task" [ 733.494660] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 733.506833] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 733.507093] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Creating directory with path [datastore1] vmware_temp/26c326a8-20aa-495a-89a5-4d0476888342/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 733.507321] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2893c056-6aaa-4df3-8313-5cefe17c17d0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.518038] env[67270]: DEBUG nova.network.neutron [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 733.521854] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Created directory with path [datastore1] vmware_temp/26c326a8-20aa-495a-89a5-4d0476888342/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 733.522096] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Fetch image to [datastore1] vmware_temp/26c326a8-20aa-495a-89a5-4d0476888342/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 733.522277] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/26c326a8-20aa-495a-89a5-4d0476888342/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 733.523091] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdc1a737-f1db-45a6-b979-4d760128aba0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.531838] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad4c0416-600d-4f0c-96ec-c3308d1bbcdc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.543694] env[67270]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c10c8c8-aa2c-4955-8f34-85da7e1ad7c0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.583239] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f734fab-e0f1-43d6-9c17-31bd88550c57 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.591090] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f3677518-59ac-41d6-a672-ad638da25737 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.633762] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 733.710901] env[67270]: DEBUG oslo_vmware.rw_handles [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/26c326a8-20aa-495a-89a5-4d0476888342/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 733.774269] env[67270]: DEBUG nova.network.neutron [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 733.781073] env[67270]: DEBUG oslo_vmware.rw_handles [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 733.781073] env[67270]: DEBUG oslo_vmware.rw_handles [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/26c326a8-20aa-495a-89a5-4d0476888342/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 733.795410] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Releasing lock "refresh_cache-891481a1-edb6-4111-9779-23ba64d85dce" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 733.796188] env[67270]: DEBUG nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 733.796405] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 733.797936] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad8fda11-1fb6-432b-ab8f-eafd5f0d891e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.808801] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 733.808801] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-40be1047-2b46-464a-9c77-d0452d38ab23 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.842992] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 733.843272] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 733.843443] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Deleting the datastore file [datastore1] 891481a1-edb6-4111-9779-23ba64d85dce {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 733.843705] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c862b58c-e9ce-482a-85ab-8e3cee88ffcd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.854526] env[67270]: DEBUG oslo_vmware.api [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 
tempest-ServersAdmin275Test-1742033693-project-member] Waiting for the task: (returnval){ [ 733.854526] env[67270]: value = "task-4110580" [ 733.854526] env[67270]: _type = "Task" [ 733.854526] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 733.868708] env[67270]: DEBUG oslo_vmware.api [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Task: {'id': task-4110580, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 734.365008] env[67270]: DEBUG oslo_vmware.api [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Task: {'id': task-4110580, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.040215} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 734.365008] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 734.365341] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 734.365341] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 734.365669] env[67270]: INFO nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Took 0.57 seconds to destroy the instance on the hypervisor. [ 734.365917] env[67270]: DEBUG oslo.service.loopingcall [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 734.366135] env[67270]: DEBUG nova.compute.manager [-] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Skipping network deallocation for instance since networking was not requested.
{{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 734.369352] env[67270]: DEBUG nova.compute.claims [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 734.369547] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.369773] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.557210] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f522e0b-15c5-4064-9d09-7e5c973d1dd2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.571085] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51a0b6b3-6d11-4964-8dda-79f601d88588 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.606576] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba6faf65-e2ad-4459-9951-6aa56fc9d70d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.614854] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43153a60-d188-4aa1-8678-fc8d1006f2df {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.629015] env[67270]: DEBUG nova.compute.provider_tree [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 734.639679] env[67270]: DEBUG nova.scheduler.client.report [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 734.655073] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 
tempest-ServersAdmin275Test-1742033693-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.285s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.655391] env[67270]: ERROR nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 734.655391] env[67270]: Faults: ['InvalidArgument'] [ 734.655391] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Traceback (most recent call last): [ 734.655391] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 734.655391] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] self.driver.spawn(context, instance, image_meta, [ 734.655391] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 734.655391] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] self._vmops.spawn(context, instance, image_meta, injected_files, [ 734.655391] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 734.655391] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] self._fetch_image_if_missing(context, vi) [ 734.655391] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 734.655391] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] image_cache(vi, tmp_image_ds_loc) [ 734.655391] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] vm_util.copy_virtual_disk( [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] session._wait_for_task(vmdk_copy_task) [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] return self.wait_for_task(task_ref) [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] return evt.wait() [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] result = hub.switch() [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] return self.greenlet.switch() [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 734.656889] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] self.f(*self.args, **self.kw) [ 734.657771] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 734.657771] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] raise exceptions.translate_fault(task_info.error) [ 734.657771] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 734.657771] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Faults: ['InvalidArgument'] [ 734.657771] env[67270]: ERROR nova.compute.manager [instance: 891481a1-edb6-4111-9779-23ba64d85dce] [ 734.657771] env[67270]: DEBUG nova.compute.utils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 734.662635] env[67270]: DEBUG nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Build of instance 891481a1-edb6-4111-9779-23ba64d85dce was re-scheduled: A specified parameter was not correct: fileType [ 734.662635] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 734.663074] env[67270]: DEBUG nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 734.663309] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquiring lock "refresh_cache-891481a1-edb6-4111-9779-23ba64d85dce" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 734.663464] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Acquired lock "refresh_cache-891481a1-edb6-4111-9779-23ba64d85dce" {{(pid=67270) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 734.663625] env[67270]: DEBUG nova.network.neutron [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 734.703113] env[67270]: DEBUG nova.network.neutron [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.852721] env[67270]: DEBUG nova.network.neutron [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.869926] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Releasing lock "refresh_cache-891481a1-edb6-4111-9779-23ba64d85dce" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 734.870316] env[67270]: DEBUG nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 734.870518] env[67270]: DEBUG nova.compute.manager [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] [instance: 891481a1-edb6-4111-9779-23ba64d85dce] Skipping network deallocation for instance since networking was not requested. {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 735.034654] env[67270]: INFO nova.scheduler.client.report [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Deleted allocations for instance 891481a1-edb6-4111-9779-23ba64d85dce [ 735.059491] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c792b0bf-6a4e-4aaa-85d3-32042d7a48dd tempest-ServersAdmin275Test-1742033693 tempest-ServersAdmin275Test-1742033693-project-member] Lock "891481a1-edb6-4111-9779-23ba64d85dce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 55.438s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.110971] env[67270]: DEBUG nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Starting instance...
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 735.185224] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.185427] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.186994] env[67270]: INFO nova.compute.claims [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 735.466579] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77ec1190-3b7e-43d8-be40-8575002d924b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.476906] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76490c94-ba6c-4fd8-a707-565e1b8e7b4b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.524815] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e424f0aa-6da7-4944-a3aa-0ca0710c2332 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.536949] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-442321b5-4326-4098-8e80-5534c0625d6b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.553589] env[67270]: DEBUG nova.compute.provider_tree [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 735.566060] env[67270]: DEBUG nova.scheduler.client.report [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 735.584429] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 
tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.398s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.584429] env[67270]: DEBUG nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 735.621799] env[67270]: DEBUG nova.compute.utils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 735.623264] env[67270]: DEBUG nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Not allocating networking since 'none' was specified. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 735.632456] env[67270]: DEBUG nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 735.717255] env[67270]: DEBUG nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Start spawning the instance on the hypervisor. 
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 735.743125] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=<?>,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-05-14T00:53:51Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 735.743409] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 735.743566] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 735.743741] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 735.743993] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 735.744183] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 735.744398] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 735.744584] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [
735.744759] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 735.744918] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 735.745097] env[67270]: DEBUG nova.virt.hardware [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 735.746489] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af6b06fc-362b-4663-a227-b437fc2acc91 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.757963] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13fecdab-d2a9-4783-bcbf-70777f0b877a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.778341] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Instance VIF info [] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 735.784271] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Creating folder: Project (55a1b56a7e1347b6b3f985c15c5cc895). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 735.785460] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6fa5b0a1-f346-4966-a363-97ce3a98d993 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.797344] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Created folder: Project (55a1b56a7e1347b6b3f985c15c5cc895) in parent group-v814248. [ 735.797545] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Creating folder: Instances. Parent ref: group-v814279. 
{{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 735.797787] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-809b65b9-0f6b-41b0-92c4-9af6eb00a0e0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.811442] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Created folder: Instances in parent group-v814279. [ 735.815019] env[67270]: DEBUG oslo.service.loopingcall [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 735.815019] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 735.815019] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b5c0a074-4247-422c-81d6-3f416b3d10dc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.831293] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 735.831293] env[67270]: value = "task-4110583" [ 735.831293] env[67270]: _type = "Task" [ 735.831293] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 735.840131] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110583, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 736.342729] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110583, 'name': CreateVM_Task, 'duration_secs': 0.264614} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 736.342906] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 736.343329] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 736.343490] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 736.343814] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 736.344087] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-93e0718f-d195-4d8a-8db7-a5ba03bed984 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.349637] env[67270]: DEBUG oslo_vmware.api [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Waiting for the task: (returnval){ [ 736.349637] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52e3197a-b190-c90f-22f7-c21e27d24c31" [ 736.349637] env[67270]: _type = "Task" [ 736.349637] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 736.360537] env[67270]: DEBUG oslo_vmware.api [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52e3197a-b190-c90f-22f7-c21e27d24c31, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 736.874482] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 736.874482] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 736.874482] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 750.254455] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.291153] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.291153] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.291153] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 750.759656] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.759656] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.759656] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 750.759656] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 750.783267] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.783267] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.783267] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.785367] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.785367] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.785447] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.785702] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Skipping network cache update for instance because it is Building. 
{{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.785702] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.785831] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.786340] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 750.786507] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.798814] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.799066] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.799352] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.799410] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 750.800584] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3eb39f3-60ce-41bb-a07b-a17043c8dff5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.814454] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8648785-d3ff-4f1f-ad39-f86632163eb4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.834502] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8dd9353-b1bc-414e-b9dd-59a0937ad67e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.843170] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70a36dd7-e123-4de1-a0f2-6d6e3debfff3 {{(pid=67270) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.882923] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180748MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 750.883074] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.883308] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.963197] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance c2867798-9109-4f85-ae60-3830a711f21f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.963387] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a51d9480-1aa1-48c9-a05c-943589d6a224 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.963492] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4a086288-b773-40aa-b39a-e3f3b9784a05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.963651] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 1e482ed7-9c9f-4713-abde-291417686a78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.963818] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance c847f4cb-1914-497b-8d63-5b99a237e5e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.963946] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 379f5a6d-d6d4-434a-b401-1b027434e6fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.964086] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a073c7a9-d7ee-4d9e-be23-4345ed5f9047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.964211] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 5d61c322-6a7d-4991-8cc4-6dcb1be74256 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.964329] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.964529] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 750.964669] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 751.120320] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2a949f6-f2a9-47ea-a7ad-37ab1efec93b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.129387] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8283663f-2566-42f9-90fb-31a81a3da2fd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.168036] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb30f095-a387-499d-9a11-b8d4961db4d9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.178078] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a14d8bdf-c3c1-4db0-af28-99886a5c6ba6 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.194424] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 751.217147] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: 
{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 751.253048] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 751.253048] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.370s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.224659] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.224944] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.225126] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.225218] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 759.911123] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquiring lock "8ddc70e6-ec6f-4740-8109-6ba2c5d00536" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 759.913594] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Lock "8ddc70e6-ec6f-4740-8109-6ba2c5d00536" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.473223] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Acquiring lock "2de499d5-2eb3-4138-8c6b-41fb94ff27eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.473617] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Lock "2de499d5-2eb3-4138-8c6b-41fb94ff27eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 761.815413] env[67270]: DEBUG oslo_concurrency.lockutils [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Acquiring lock "1000d79b-b491-4071-8ab0-aac90dac6b51" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 761.815684] env[67270]: DEBUG oslo_concurrency.lockutils [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "1000d79b-b491-4071-8ab0-aac90dac6b51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 762.278771] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Acquiring lock "69980b41-9514-4d97-aa75-ea68dd05b241" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 762.279293] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Lock "69980b41-9514-4d97-aa75-ea68dd05b241" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.043885] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Acquiring lock "49292f00-1457-438b-b5b7-2ac35dd464d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.043885] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Lock "49292f00-1457-438b-b5b7-2ac35dd464d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.299028] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 
tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Acquiring lock "87ef9733-e8d6-429e-b23f-8b8aadef784c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.300348] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Lock "87ef9733-e8d6-429e-b23f-8b8aadef784c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 764.887536] env[67270]: DEBUG oslo_concurrency.lockutils [None req-f4fd7ac3-4900-4e59-88d7-2523d6dd78ec tempest-ServerPasswordTestJSON-958540676 tempest-ServerPasswordTestJSON-958540676-project-member] Acquiring lock "2f050e13-5621-4dda-ade1-cfbef017e57e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 764.887741] env[67270]: DEBUG oslo_concurrency.lockutils [None req-f4fd7ac3-4900-4e59-88d7-2523d6dd78ec tempest-ServerPasswordTestJSON-958540676 tempest-ServerPasswordTestJSON-958540676-project-member] Lock "2f050e13-5621-4dda-ade1-cfbef017e57e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 765.899621] env[67270]: DEBUG oslo_concurrency.lockutils [None req-71d402b3-595b-487e-b083-c7ea072f93d0 tempest-ServersNegativeTestJSON-834909547 tempest-ServersNegativeTestJSON-834909547-project-member] Acquiring lock "4a1a791f-36f3-48af-9792-4a9eaeba26c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 765.899621] env[67270]: DEBUG oslo_concurrency.lockutils [None req-71d402b3-595b-487e-b083-c7ea072f93d0 tempest-ServersNegativeTestJSON-834909547 tempest-ServersNegativeTestJSON-834909547-project-member] Lock "4a1a791f-36f3-48af-9792-4a9eaeba26c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 766.337521] env[67270]: DEBUG oslo_concurrency.lockutils [None req-02016afc-c690-44b1-ae7f-e4f0679a6a37 tempest-ServerTagsTestJSON-710292736 tempest-ServerTagsTestJSON-710292736-project-member] Acquiring lock "907dfc72-e766-4a24-a4e7-df762db37824" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 766.337619] env[67270]: DEBUG oslo_concurrency.lockutils [None req-02016afc-c690-44b1-ae7f-e4f0679a6a37 tempest-ServerTagsTestJSON-710292736 tempest-ServerTagsTestJSON-710292736-project-member] Lock "907dfc72-e766-4a24-a4e7-df762db37824" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 766.959936] env[67270]: DEBUG oslo_concurrency.lockutils [None req-2496d48d-9e7c-41af-ac0c-261720e759a6 tempest-ImagesOneServerNegativeTestJSON-359832043 tempest-ImagesOneServerNegativeTestJSON-359832043-project-member] Acquiring lock "cbe3ecc4-3c5b-4749-a21c-c0376583c4aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 766.959936] env[67270]: DEBUG oslo_concurrency.lockutils [None req-2496d48d-9e7c-41af-ac0c-261720e759a6 tempest-ImagesOneServerNegativeTestJSON-359832043 tempest-ImagesOneServerNegativeTestJSON-359832043-project-member] Lock "cbe3ecc4-3c5b-4749-a21c-c0376583c4aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 767.840352] env[67270]: DEBUG oslo_concurrency.lockutils [None req-94475fdc-a8d0-4259-961d-5e3d6a6a61b6 tempest-DeleteServersTestJSON-2013465335 tempest-DeleteServersTestJSON-2013465335-project-member] Acquiring lock "6546bb93-d032-4b32-b42f-49bbf36b8e82" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 767.840589] env[67270]: DEBUG oslo_concurrency.lockutils [None req-94475fdc-a8d0-4259-961d-5e3d6a6a61b6 tempest-DeleteServersTestJSON-2013465335 tempest-DeleteServersTestJSON-2013465335-project-member] Lock "6546bb93-d032-4b32-b42f-49bbf36b8e82" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 768.740644] env[67270]: DEBUG oslo_concurrency.lockutils [None req-f9750147-a710-49b3-96cf-b0b3248c9e82 tempest-VolumesAdminNegativeTest-1789479060 tempest-VolumesAdminNegativeTest-1789479060-project-member] Acquiring lock "f42f9cc0-c33a-4bdc-b16c-8dec61896b27" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 768.740955] env[67270]: DEBUG oslo_concurrency.lockutils [None req-f9750147-a710-49b3-96cf-b0b3248c9e82 tempest-VolumesAdminNegativeTest-1789479060 tempest-VolumesAdminNegativeTest-1789479060-project-member] Lock "f42f9cc0-c33a-4bdc-b16c-8dec61896b27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 772.330328] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6ac2fbe-ce0b-436e-8245-b1b738d351c2 tempest-DeleteServersTestJSON-2013465335 tempest-DeleteServersTestJSON-2013465335-project-member] Acquiring lock "2a6c8de3-8974-4533-a474-c4242fd735c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 772.330651] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6ac2fbe-ce0b-436e-8245-b1b738d351c2 tempest-DeleteServersTestJSON-2013465335 tempest-DeleteServersTestJSON-2013465335-project-member] Lock 
"2a6c8de3-8974-4533-a474-c4242fd735c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 772.413933] env[67270]: DEBUG oslo_concurrency.lockutils [None req-a191e8b9-4f6a-4284-a10e-d1aeb80442ad tempest-MultipleCreateTestJSON-31533564 tempest-MultipleCreateTestJSON-31533564-project-member] Acquiring lock "4e53a7b7-7194-4ceb-abef-5d0779effbfb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 772.414214] env[67270]: DEBUG oslo_concurrency.lockutils [None req-a191e8b9-4f6a-4284-a10e-d1aeb80442ad tempest-MultipleCreateTestJSON-31533564 tempest-MultipleCreateTestJSON-31533564-project-member] Lock "4e53a7b7-7194-4ceb-abef-5d0779effbfb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 772.451967] env[67270]: DEBUG oslo_concurrency.lockutils [None req-a191e8b9-4f6a-4284-a10e-d1aeb80442ad tempest-MultipleCreateTestJSON-31533564 tempest-MultipleCreateTestJSON-31533564-project-member] Acquiring lock "4c9dbddd-4c74-4ee0-a1be-e7a5c7cfc344" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 772.452747] env[67270]: DEBUG oslo_concurrency.lockutils [None req-a191e8b9-4f6a-4284-a10e-d1aeb80442ad tempest-MultipleCreateTestJSON-31533564 tempest-MultipleCreateTestJSON-31533564-project-member] Lock "4c9dbddd-4c74-4ee0-a1be-e7a5c7cfc344" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 774.225370] env[67270]: DEBUG oslo_concurrency.lockutils [None req-7ea8e4d8-434a-4dbd-971a-6a8af1221e03 tempest-ServerRescueNegativeTestJSON-1936964094 tempest-ServerRescueNegativeTestJSON-1936964094-project-member] Acquiring lock "ee08ac0e-d7fb-4f36-962b-cb8b88bf6bb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 774.225634] env[67270]: DEBUG oslo_concurrency.lockutils [None req-7ea8e4d8-434a-4dbd-971a-6a8af1221e03 tempest-ServerRescueNegativeTestJSON-1936964094 tempest-ServerRescueNegativeTestJSON-1936964094-project-member] Lock "ee08ac0e-d7fb-4f36-962b-cb8b88bf6bb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 775.984035] env[67270]: DEBUG oslo_concurrency.lockutils [None req-2d09057f-9b62-40d7-8664-1194617e51eb tempest-ServerRescueNegativeTestJSON-1936964094 tempest-ServerRescueNegativeTestJSON-1936964094-project-member] Acquiring lock "a9aaa31c-5228-4210-b3c0-ca8c5a8c6213" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 775.984336] env[67270]: DEBUG 
oslo_concurrency.lockutils [None req-2d09057f-9b62-40d7-8664-1194617e51eb tempest-ServerRescueNegativeTestJSON-1936964094 tempest-ServerRescueNegativeTestJSON-1936964094-project-member] Lock "a9aaa31c-5228-4210-b3c0-ca8c5a8c6213" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 776.762869] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cba3a315-e09d-4ecf-9e44-4a01627e2758 tempest-ServersTestManualDisk-516412660 tempest-ServersTestManualDisk-516412660-project-member] Acquiring lock "4dce8f09-ce7e-419c-90b4-48ee54d8c604" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 776.763815] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cba3a315-e09d-4ecf-9e44-4a01627e2758 tempest-ServersTestManualDisk-516412660 tempest-ServersTestManualDisk-516412660-project-member] Lock "4dce8f09-ce7e-419c-90b4-48ee54d8c604" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 781.545200] env[67270]: DEBUG oslo_concurrency.lockutils [None req-93251977-26dd-41aa-b8e3-b10604cd7e16 tempest-ServerActionsTestOtherA-1897363833 tempest-ServerActionsTestOtherA-1897363833-project-member] Acquiring lock "c372287f-35e3-402a-9841-6f55ea471d3d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 781.545535] env[67270]: DEBUG oslo_concurrency.lockutils [None req-93251977-26dd-41aa-b8e3-b10604cd7e16 tempest-ServerActionsTestOtherA-1897363833 tempest-ServerActionsTestOtherA-1897363833-project-member] Lock "c372287f-35e3-402a-9841-6f55ea471d3d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 781.926434] env[67270]: DEBUG oslo_concurrency.lockutils [None req-ad659196-710f-478e-a478-1981e05dc130 tempest-ServerShowV254Test-191532395 tempest-ServerShowV254Test-191532395-project-member] Acquiring lock "2de2d5d9-2644-408a-8957-2c169b2793ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 781.926818] env[67270]: DEBUG oslo_concurrency.lockutils [None req-ad659196-710f-478e-a478-1981e05dc130 tempest-ServerShowV254Test-191532395 tempest-ServerShowV254Test-191532395-project-member] Lock "2de2d5d9-2644-408a-8957-2c169b2793ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 782.969061] env[67270]: WARNING oslo_vmware.rw_handles [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 
782.969061] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 782.969061] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 782.969061] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 782.969061] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 782.969061] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 782.969061] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 782.969061] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 782.969061] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 782.969061] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 782.969061] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 782.969061] env[67270]: ERROR oslo_vmware.rw_handles [ 782.969589] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/26c326a8-20aa-495a-89a5-4d0476888342/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 782.971147] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 782.971418] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Copying Virtual Disk [datastore1] vmware_temp/26c326a8-20aa-495a-89a5-4d0476888342/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/26c326a8-20aa-495a-89a5-4d0476888342/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 782.973987] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ed3e0bc6-6f13-4a65-97fe-a57a762453a3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.982016] env[67270]: DEBUG oslo_vmware.api [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Waiting for the task: (returnval){ [ 782.982016] env[67270]: value = "task-4110595" [ 782.982016] env[67270]: _type = "Task" [ 782.982016] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 782.992830] env[67270]: DEBUG oslo_vmware.api [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Task: {'id': task-4110595, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 783.498041] env[67270]: DEBUG oslo_vmware.exceptions [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Fault InvalidArgument not matched. {{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 783.498041] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 783.498041] env[67270]: ERROR nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 783.498041] env[67270]: Faults: ['InvalidArgument'] [ 783.498041] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] Traceback (most recent call last): [ 783.498041] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 783.498041] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] yield resources [ 783.498041] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 783.498041] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] self.driver.spawn(context, instance, image_meta, [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] self._fetch_image_if_missing(context, vi) [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] image_cache(vi, tmp_image_ds_loc) [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] vm_util.copy_virtual_disk( [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] session._wait_for_task(vmdk_copy_task) [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] return self.wait_for_task(task_ref) [ 783.498354] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] return evt.wait() [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] result = hub.switch() [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] return self.greenlet.switch() [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] self.f(*self.args, **self.kw) [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] raise exceptions.translate_fault(task_info.error) [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] Faults: ['InvalidArgument'] [ 783.498639] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] [ 783.498639] env[67270]: INFO nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Terminating instance [ 783.501041] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" 
{{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 783.501041] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 783.501041] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 783.501496] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 783.501605] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fa2d2829-a84a-4052-9b50-f42f08957f02 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 783.504633] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4082af0-9a29-4939-893e-62a73749e752 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 783.514787] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 783.516259] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-765306c2-d4a7-44ad-ae22-69dad5ff9584 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 783.518012] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 783.518213] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 783.518874] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-66266f6d-990a-4fea-81a9-395ad9adb0d5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 783.525801] env[67270]: DEBUG oslo_vmware.api [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Waiting for the task: (returnval){ [ 783.525801] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]5293b957-3b4b-30ca-c1ff-b670ac221615" [ 783.525801] env[67270]: _type = "Task" [ 783.525801] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 783.536388] env[67270]: DEBUG oslo_vmware.api [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]5293b957-3b4b-30ca-c1ff-b670ac221615, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 783.600679] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 783.600946] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 783.601089] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Deleting the datastore file [datastore1] c2867798-9109-4f85-ae60-3830a711f21f {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 783.601346] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ca2e474b-6f67-4c6e-ab7d-be0d9912857c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 783.608938] env[67270]: DEBUG oslo_vmware.api [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Waiting for the task: (returnval){ [ 783.608938] env[67270]: value = "task-4110597" [ 783.608938] env[67270]: _type = "Task" [ 783.608938] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 783.621408] env[67270]: DEBUG oslo_vmware.api [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Task: {'id': task-4110597, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 784.041564] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 784.041564] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Creating directory with path [datastore1] vmware_temp/35da4e85-5e5e-4a13-87e0-54db8c1b1539/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 784.042260] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5791a8ae-e588-4639-8f59-a923e01eb6aa {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.057358] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Created directory with path [datastore1] vmware_temp/35da4e85-5e5e-4a13-87e0-54db8c1b1539/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 784.059267] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Fetch image to [datastore1] vmware_temp/35da4e85-5e5e-4a13-87e0-54db8c1b1539/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 784.059267] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/35da4e85-5e5e-4a13-87e0-54db8c1b1539/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 784.059267] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-307a2800-992c-44b1-b051-54b3abf9d78a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.072912] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6014f87c-881b-43a3-b334-0736d5f8c00d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.087836] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0a07e69-4d53-4fd8-8e42-c3969612d698 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.126022] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-2d925a5b-7a7d-4364-97d8-dae6dd021c69 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.136434] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-40e1ada2-ee21-4933-9426-52cb01d80c7b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.138607] env[67270]: DEBUG oslo_vmware.api [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Task: {'id': task-4110597, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082892} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 784.138861] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 784.139151] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 784.139226] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 784.139387] env[67270]: INFO nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Took 0.64 seconds to destroy the instance on the hypervisor. 
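[editor's note] The records above show the vSphere task protocol this driver uses throughout the log: a *_Task method (SearchDatastore_Task, DeleteDatastoreFile_Task) returns a task handle immediately, and oslo_vmware.api then blocks the calling greenthread in a fixed-interval polling loop, logging "progress is N%" until the task reports success (yielding the 'duration_secs' seen above) or raises a translated fault. A minimal stdlib-only sketch of that poll-until-done pattern follows; the FakeTask stub, the 0.5 s interval, and the state names are illustrative assumptions, not oslo.vmware's actual implementation.

    import time

    class FakeTask:
        """Stand-in for a vSphere task handle (illustrative only)."""

        def __init__(self, polls_until_done=3):
            self._polls = polls_until_done
            self.progress = 0

        def info(self):
            # Each poll moves the fake task closer to completion.
            if self._polls > 0:
                self._polls -= 1
                self.progress = min(self.progress + 40, 99)
                return {'state': 'running', 'progress': self.progress}
            return {'state': 'success', 'progress': 100}

    def wait_for_task(task, interval=0.5):
        """Poll a task until it finishes, like the _poll_task loop in the log."""
        start = time.monotonic()
        while True:
            info = task.info()
            if info['state'] == 'success':
                return time.monotonic() - start  # the 'duration_secs' in the log
            if info['state'] == 'error':
                # oslo.vmware raises a translated VimFaultException here.
                raise RuntimeError('task failed')
            print("Task progress is %d%%." % info['progress'])
            time.sleep(interval)

    print('completed successfully in %.6fs' % wait_for_task(FakeTask()))

The same loop explains the failure path further down: when the polled task ends in an error state, the translated fault (here a VimFaultException) propagates out of wait_for_task into the compute manager's build path.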
[ 784.142071] env[67270]: DEBUG nova.compute.claims [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 784.142071] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 784.142071] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 784.175623] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 784.220143] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d7e1c0a-a54e-436f-8bdb-6cc2c6f2b3fb tempest-ServerMetadataNegativeTestJSON-173231563 tempest-ServerMetadataNegativeTestJSON-173231563-project-member] Acquiring lock "65422c06-b1cf-4868-8f38-391b08038fc9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 784.220489] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d7e1c0a-a54e-436f-8bdb-6cc2c6f2b3fb tempest-ServerMetadataNegativeTestJSON-173231563 tempest-ServerMetadataNegativeTestJSON-173231563-project-member] Lock "65422c06-b1cf-4868-8f38-391b08038fc9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 784.271644] env[67270]: DEBUG oslo_vmware.rw_handles [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/35da4e85-5e5e-4a13-87e0-54db8c1b1539/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 784.336190] env[67270]: DEBUG oslo_vmware.rw_handles [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Completed reading data from the image iterator. 
{{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 784.336298] env[67270]: DEBUG oslo_vmware.rw_handles [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/35da4e85-5e5e-4a13-87e0-54db8c1b1539/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 784.780729] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1120f9f-5223-49d5-a21c-7235e2a5eba2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.792719] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67e7b156-795b-4da9-bf35-c208d6e5bd31 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.831822] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d6e45f8-4250-4d1b-8041-bc5e2c4d75eb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.839714] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce84fe65-bb91-4f85-b763-588162baf866 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.856879] env[67270]: DEBUG nova.compute.provider_tree [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 784.867104] env[67270]: DEBUG nova.scheduler.client.report [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 784.891983] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.750s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 784.892568] env[67270]: ERROR nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Failed to build and run instance: 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 784.892568] env[67270]: Faults: ['InvalidArgument'] [ 784.892568] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] Traceback (most recent call last): [ 784.892568] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 784.892568] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] self.driver.spawn(context, instance, image_meta, [ 784.892568] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 784.892568] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 784.892568] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 784.892568] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] self._fetch_image_if_missing(context, vi) [ 784.892568] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 784.892568] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] image_cache(vi, tmp_image_ds_loc) [ 784.892568] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] vm_util.copy_virtual_disk( [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] session._wait_for_task(vmdk_copy_task) [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] return self.wait_for_task(task_ref) [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] return evt.wait() [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] result = hub.switch() [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] return self.greenlet.switch() [ 784.892944] env[67270]: 
ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 784.892944] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] self.f(*self.args, **self.kw) [ 784.893297] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 784.893297] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] raise exceptions.translate_fault(task_info.error) [ 784.893297] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 784.893297] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] Faults: ['InvalidArgument'] [ 784.893297] env[67270]: ERROR nova.compute.manager [instance: c2867798-9109-4f85-ae60-3830a711f21f] [ 784.895347] env[67270]: DEBUG nova.compute.utils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 784.897575] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Build of instance c2867798-9109-4f85-ae60-3830a711f21f was re-scheduled: A specified parameter was not correct: fileType [ 784.897575] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 784.897575] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 784.897729] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 784.897865] env[67270]: DEBUG nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 784.898039] env[67270]: DEBUG nova.network.neutron [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 785.467951] env[67270]: DEBUG nova.network.neutron [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 785.483034] env[67270]: INFO nova.compute.manager [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: c2867798-9109-4f85-ae60-3830a711f21f] Took 0.58 seconds to deallocate network for instance. [ 785.592839] env[67270]: INFO nova.scheduler.client.report [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Deleted allocations for instance c2867798-9109-4f85-ae60-3830a711f21f [ 785.615990] env[67270]: DEBUG oslo_concurrency.lockutils [None req-84faecef-d972-4427-a860-8274158efa16 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Lock "c2867798-9109-4f85-ae60-3830a711f21f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 107.086s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 785.647206] env[67270]: DEBUG nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Starting instance... 
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 785.714944] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 785.715247] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 785.716777] env[67270]: INFO nova.compute.claims [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 786.226131] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0db094a8-dbc7-4c3e-a957-f82de3fd0607 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.234918] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c119e66-2ad2-4018-a386-98cdb43c5eb8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.269327] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f872d26c-4c5f-4c58-a46b-1ea33dc1d063 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.277956] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd33c210-3984-44b6-88b6-95640609304c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.293071] env[67270]: DEBUG nova.compute.provider_tree [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 786.303927] env[67270]: DEBUG nova.scheduler.client.report [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 786.333077] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 
tempest-ServerShowV247Test-1023161172-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.618s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 786.333719] env[67270]: DEBUG nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 786.387548] env[67270]: DEBUG nova.compute.utils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 786.388963] env[67270]: DEBUG nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Not allocating networking since 'none' was specified. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 786.402214] env[67270]: DEBUG nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 786.482029] env[67270]: DEBUG nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Start spawning the instance on the hypervisor. 
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 786.502846] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 786.503108] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 786.503262] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 786.503439] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 786.503579] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 786.503721] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 786.503922] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 786.504187] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 786.504375] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb 
tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 786.505963] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 786.505963] env[67270]: DEBUG nova.virt.hardware [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 786.505963] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85fb0310-7129-4821-acd0-ae94eb669599 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.514778] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-301cbb8b-9420-46b0-833d-d85d60ef0ce9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.538018] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Instance VIF info [] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 786.545637] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Creating folder: Project (07b483296d4d4a01a20bf5807e0b6631). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 786.546033] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1ddf2c94-2996-49a4-9df2-2772ab13f388 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.557525] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Created folder: Project (07b483296d4d4a01a20bf5807e0b6631) in parent group-v814248. [ 786.557903] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Creating folder: Instances. Parent ref: group-v814286. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 786.558093] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-330e7878-7ce4-4ee0-a5f7-750c5bd47f2b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.568031] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Created folder: Instances in parent group-v814286. 
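[editor's note] The two Folder.CreateFolder invocations above build the inventory path Project (07b483296d4d4a01a20bf5807e0b6631)/Instances under group-v814248 before the VM itself is created. Folder creation here is effectively idempotent: if vCenter answers CreateFolder with a DuplicateName fault (e.g. a concurrent build already made the folder), the existing child folder is looked up and reused rather than failing the build. The toy model below sketches that pattern with a dict-backed "inventory"; the DuplicateName class and the dict structure are illustrative assumptions, not the vSphere API.

    class DuplicateName(Exception):
        """Stand-in for the vSphere DuplicateName fault (illustrative only)."""

    def create_folder(inventory, parent, name):
        """Create parent/name, reusing the folder if it already exists.

        'inventory' maps a parent ref to {child name: child ref}; a toy
        model of the vCenter folder tree, not the real API.
        """
        children = inventory.setdefault(parent, {})
        try:
            if name in children:
                raise DuplicateName(name)
            ref = '%s/%s' % (parent, name)
            children[name] = ref
            print('Created folder: %s in parent %s.' % (name, parent))
            return ref
        except DuplicateName:
            # Someone created it first; reuse the existing folder ref.
            print('Folder %s already exists, reusing it.' % name)
            return children[name]

    tree = {}
    project = create_folder(tree, 'group-v814248',
                            'Project (07b483296d4d4a01a20bf5807e0b6631)')
    create_folder(tree, project, 'Instances')
    create_folder(tree, project, 'Instances')  # duplicate call reuses the folder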
[ 786.568031] env[67270]: DEBUG oslo.service.loopingcall [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 786.568347] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 786.568347] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5cc5d966-f2fc-4d8e-b9a1-ba67526dddac {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.585910] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 786.585910] env[67270]: value = "task-4110600" [ 786.585910] env[67270]: _type = "Task" [ 786.585910] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 786.595209] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110600, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 786.939694] env[67270]: DEBUG oslo_concurrency.lockutils [None req-810fdb95-eb3e-4084-9138-9f2aed01baef tempest-ServerRescueTestJSONUnderV235-461983559 tempest-ServerRescueTestJSONUnderV235-461983559-project-member] Acquiring lock "e976fd9e-95a3-4564-9bd6-08ee3f15a188" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 786.939977] env[67270]: DEBUG oslo_concurrency.lockutils [None req-810fdb95-eb3e-4084-9138-9f2aed01baef tempest-ServerRescueTestJSONUnderV235-461983559 tempest-ServerRescueTestJSONUnderV235-461983559-project-member] Lock "e976fd9e-95a3-4564-9bd6-08ee3f15a188" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 787.095650] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110600, 'name': CreateVM_Task, 'duration_secs': 0.283306} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 787.095829] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 787.096304] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 787.096460] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 787.096946] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 787.097053] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9097ea5e-1a98-4f4d-885d-b0fa42537756 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.102303] env[67270]: DEBUG oslo_vmware.api [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Waiting for the task: (returnval){ [ 787.102303] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52ec2cfe-5f58-0e1c-d66b-93e549899ecd" [ 787.102303] env[67270]: _type = "Task" [ 787.102303] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 787.110973] env[67270]: DEBUG oslo_vmware.api [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52ec2cfe-5f58-0e1c-d66b-93e549899ecd, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 787.630305] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 787.634525] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 787.634792] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 787.709474] env[67270]: DEBUG oslo_concurrency.lockutils [None req-2b1d5d17-3319-4ea8-bc94-48cdb7ad94d1 tempest-ServersV294TestFqdnHostnames-831052524 tempest-ServersV294TestFqdnHostnames-831052524-project-member] Acquiring lock "662bb470-e6ed-4a37-bb23-74a0a36dff0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 787.709745] env[67270]: DEBUG oslo_concurrency.lockutils [None req-2b1d5d17-3319-4ea8-bc94-48cdb7ad94d1 tempest-ServersV294TestFqdnHostnames-831052524 tempest-ServersV294TestFqdnHostnames-831052524-project-member] Lock "662bb470-e6ed-4a37-bb23-74a0a36dff0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 788.183324] env[67270]: DEBUG oslo_concurrency.lockutils [None req-fe63a5af-f7d2-4520-b9c3-1fdc47f0f886 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Acquiring lock "25cc189a-383b-450c-810d-85ea2b48fdca" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 788.183776] env[67270]: DEBUG oslo_concurrency.lockutils [None req-fe63a5af-f7d2-4520-b9c3-1fdc47f0f886 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Lock "25cc189a-383b-450c-810d-85ea2b48fdca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 810.758768] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 810.758768] env[67270]: DEBUG nova.compute.manager [None 
req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 810.758768] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 810.778606] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 810.778764] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 810.778926] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 810.784071] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 810.784328] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 810.784474] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 810.784604] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 810.784728] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 810.784847] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 810.784967] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. 
{{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 810.785484] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 810.785666] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 810.785799] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 811.759056] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 811.759056] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 811.769369] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 811.769722] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 811.769958] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 811.770195] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 811.771301] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50f06e90-b1b0-42b2-94e3-86b360204dfe {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.780786] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fd7aa49-8c5a-4619-997a-7baaa1a62e45 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.795949] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-948f25a0-8567-4d87-be95-83fd342a26d3 {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.803559] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6643bb8-e9c4-46cb-80fc-1d2626686fec {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.834946] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180789MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 811.835046] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 811.835251] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 811.901073] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a51d9480-1aa1-48c9-a05c-943589d6a224 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 811.901257] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4a086288-b773-40aa-b39a-e3f3b9784a05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 811.901387] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 1e482ed7-9c9f-4713-abde-291417686a78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 811.901510] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance c847f4cb-1914-497b-8d63-5b99a237e5e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 811.901630] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 379f5a6d-d6d4-434a-b401-1b027434e6fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 811.901750] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a073c7a9-d7ee-4d9e-be23-4345ed5f9047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 811.901868] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 5d61c322-6a7d-4991-8cc4-6dcb1be74256 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 811.901986] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 811.902127] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8ddc70e6-ec6f-4740-8109-6ba2c5d00536 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 811.928453] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2de499d5-2eb3-4138-8c6b-41fb94ff27eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 811.953312] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 1000d79b-b491-4071-8ab0-aac90dac6b51 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 811.968026] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 69980b41-9514-4d97-aa75-ea68dd05b241 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 811.978459] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 49292f00-1457-438b-b5b7-2ac35dd464d2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 811.988823] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 87ef9733-e8d6-429e-b23f-8b8aadef784c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 811.999940] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2f050e13-5621-4dda-ade1-cfbef017e57e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.011320] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4a1a791f-36f3-48af-9792-4a9eaeba26c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.021779] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 907dfc72-e766-4a24-a4e7-df762db37824 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.033030] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance cbe3ecc4-3c5b-4749-a21c-c0376583c4aa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.047882] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance f42f9cc0-c33a-4bdc-b16c-8dec61896b27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.059204] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4e53a7b7-7194-4ceb-abef-5d0779effbfb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.070477] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2a6c8de3-8974-4533-a474-c4242fd735c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.082041] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4c9dbddd-4c74-4ee0-a1be-e7a5c7cfc344 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.094013] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance ee08ac0e-d7fb-4f36-962b-cb8b88bf6bb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.105502] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a9aaa31c-5228-4210-b3c0-ca8c5a8c6213 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.116811] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4dce8f09-ce7e-419c-90b4-48ee54d8c604 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.127711] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance c372287f-35e3-402a-9841-6f55ea471d3d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.139019] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2de2d5d9-2644-408a-8957-2c169b2793ce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.148895] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 65422c06-b1cf-4868-8f38-391b08038fc9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.159657] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance e976fd9e-95a3-4564-9bd6-08ee3f15a188 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.170789] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 662bb470-e6ed-4a37-bb23-74a0a36dff0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.182955] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 25cc189a-383b-450c-810d-85ea2b48fdca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 812.183260] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 812.183416] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 812.569075] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-604a8ccf-1049-4698-baa7-842bf95eab4f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.577246] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9356ea07-d86e-49ad-8954-ff269e2d8403 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.609095] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9138d6bd-871a-46b1-8f14-4cb4b2cb299b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.617420] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f3e1722-4847-405d-bb46-c9523cca5f40 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.631113] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 812.641869] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 812.657151] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 812.657387] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.822s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 813.652567] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running 
periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 813.652848] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 813.757660] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 813.757906] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 832.983938] env[67270]: WARNING oslo_vmware.rw_handles [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 832.983938] env[67270]: ERROR oslo_vmware.rw_handles [ 832.984568] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/35da4e85-5e5e-4a13-87e0-54db8c1b1539/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 832.986466] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 832.986797] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Copying Virtual Disk [datastore1] vmware_temp/35da4e85-5e5e-4a13-87e0-54db8c1b1539/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/35da4e85-5e5e-4a13-87e0-54db8c1b1539/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 832.987233] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4bb01498-e5c7-452c-a688-7de5b3d9b7d9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.996958] env[67270]: DEBUG oslo_vmware.api [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Waiting for the task: (returnval){ [ 832.996958] env[67270]: value = "task-4110601" [ 832.996958] env[67270]: _type = "Task" [ 832.996958] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 833.006079] env[67270]: DEBUG oslo_vmware.api [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Task: {'id': task-4110601, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 833.508206] env[67270]: DEBUG oslo_vmware.exceptions [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Fault InvalidArgument not matched. 
{{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 833.508537] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 833.509161] env[67270]: ERROR nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 833.509161] env[67270]: Faults: ['InvalidArgument'] [ 833.509161] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Traceback (most recent call last): [ 833.509161] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 833.509161] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] yield resources [ 833.509161] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 833.509161] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] self.driver.spawn(context, instance, image_meta, [ 833.509161] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 833.509161] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] self._vmops.spawn(context, instance, image_meta, injected_files, [ 833.509161] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 833.509161] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] self._fetch_image_if_missing(context, vi) [ 833.509161] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] image_cache(vi, tmp_image_ds_loc) [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] vm_util.copy_virtual_disk( [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] session._wait_for_task(vmdk_copy_task) [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] return self.wait_for_task(task_ref) [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] return evt.wait() [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] result = hub.switch() [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 833.509469] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] return self.greenlet.switch() [ 833.509778] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 833.509778] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] self.f(*self.args, **self.kw) [ 833.509778] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 833.509778] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] raise exceptions.translate_fault(task_info.error) [ 833.509778] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 833.509778] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Faults: ['InvalidArgument'] [ 833.509778] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] [ 833.509778] env[67270]: INFO nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Terminating instance [ 833.511083] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 833.511342] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 833.511588] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ad75b0d3-14af-4db5-bf34-03e5572eaa60 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.513885] env[67270]: 
DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 833.514090] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 833.514852] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-002ab17b-ae7c-4d0c-b855-c55f178d087a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.522104] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 833.522426] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c232fde2-90d5-408d-bcc5-381e45a95cee {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.524915] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 833.525101] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 833.526074] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e3c1b90b-cde4-4e9e-b307-ba6fe73279ce {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.531366] env[67270]: DEBUG oslo_vmware.api [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Waiting for the task: (returnval){ [ 833.531366] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52fb113d-097f-1f02-58e2-65c9dbc4c245" [ 833.531366] env[67270]: _type = "Task" [ 833.531366] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 833.540592] env[67270]: DEBUG oslo_vmware.api [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52fb113d-097f-1f02-58e2-65c9dbc4c245, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 833.599123] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 833.599316] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 833.599501] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Deleting the datastore file [datastore1] 1e482ed7-9c9f-4713-abde-291417686a78 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 833.599774] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-59047383-a318-480f-b18c-33264a99ced0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.606729] env[67270]: DEBUG oslo_vmware.api [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Waiting for the task: (returnval){ [ 833.606729] env[67270]: value = "task-4110603" [ 833.606729] env[67270]: _type = "Task" [ 833.606729] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 833.615496] env[67270]: DEBUG oslo_vmware.api [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Task: {'id': task-4110603, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 834.041865] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 834.042251] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Creating directory with path [datastore1] vmware_temp/06839709-f784-4266-98b5-5d4075ff8441/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 834.042402] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-37944b62-8245-4268-a30d-5bb922372794 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.056457] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Created directory with path [datastore1] vmware_temp/06839709-f784-4266-98b5-5d4075ff8441/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 834.056670] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Fetch image to [datastore1] vmware_temp/06839709-f784-4266-98b5-5d4075ff8441/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 834.056842] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/06839709-f784-4266-98b5-5d4075ff8441/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 834.057671] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5be0b7dd-586c-46c4-bbf7-dd89dbf81c27 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.065668] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b7966a3-e4b5-4e0a-9c53-adb87c4d4bf2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.076142] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37d61635-bab0-46fd-a4a2-fe41071499e5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.111520] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ebebb8c-f6be-416b-8086-335198268d00 {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.121082] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9e7b7c2f-ce12-46a1-88bb-7c8e39318a5d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.122959] env[67270]: DEBUG oslo_vmware.api [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Task: {'id': task-4110603, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084987} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 834.123226] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 834.123407] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 834.123575] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 834.123747] env[67270]: INFO nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 834.125941] env[67270]: DEBUG nova.compute.claims [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 834.126167] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 834.126400] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 834.156449] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 834.207349] env[67270]: DEBUG oslo_vmware.rw_handles [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/06839709-f784-4266-98b5-5d4075ff8441/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 834.266807] env[67270]: DEBUG oslo_vmware.rw_handles [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 834.267025] env[67270]: DEBUG oslo_vmware.rw_handles [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/06839709-f784-4266-98b5-5d4075ff8441/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 834.616154] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-379736f2-689a-45de-b13f-94918c864d9f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.622891] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4898eddc-189e-4cc4-be4b-1283b662e1ed {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.653818] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8cc7fbc-cb1f-4cab-97b6-706e276e59af {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.662247] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02c54d79-70e3-4860-84a0-e304e33e82c7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.677032] env[67270]: DEBUG nova.compute.provider_tree [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 834.686904] env[67270]: DEBUG nova.scheduler.client.report [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 834.700994] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.574s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 834.701580] env[67270]: ERROR nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 834.701580] env[67270]: Faults: ['InvalidArgument'] [ 834.701580] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Traceback (most recent call last): [ 834.701580] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 834.701580] env[67270]: ERROR 
nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] self.driver.spawn(context, instance, image_meta, [ 834.701580] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 834.701580] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] self._vmops.spawn(context, instance, image_meta, injected_files, [ 834.701580] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 834.701580] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] self._fetch_image_if_missing(context, vi) [ 834.701580] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 834.701580] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] image_cache(vi, tmp_image_ds_loc) [ 834.701580] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] vm_util.copy_virtual_disk( [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] session._wait_for_task(vmdk_copy_task) [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] return self.wait_for_task(task_ref) [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] return evt.wait() [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] result = hub.switch() [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] return self.greenlet.switch() [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 834.701943] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] self.f(*self.args, **self.kw) [ 834.702319] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 834.702319] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] raise exceptions.translate_fault(task_info.error) [ 834.702319] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 834.702319] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Faults: ['InvalidArgument'] [ 834.702319] env[67270]: ERROR nova.compute.manager [instance: 1e482ed7-9c9f-4713-abde-291417686a78] [ 834.702319] env[67270]: DEBUG nova.compute.utils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 834.704042] env[67270]: DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Build of instance 1e482ed7-9c9f-4713-abde-291417686a78 was re-scheduled: A specified parameter was not correct: fileType [ 834.704042] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 834.704177] env[67270]: DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 834.704337] env[67270]: DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 834.704493] env[67270]: DEBUG nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 834.704653] env[67270]: DEBUG nova.network.neutron [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 835.014070] env[67270]: DEBUG nova.network.neutron [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 835.025092] env[67270]: INFO nova.compute.manager [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] [instance: 1e482ed7-9c9f-4713-abde-291417686a78] Took 0.32 seconds to deallocate network for instance. [ 835.126656] env[67270]: INFO nova.scheduler.client.report [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Deleted allocations for instance 1e482ed7-9c9f-4713-abde-291417686a78 [ 835.150924] env[67270]: DEBUG oslo_concurrency.lockutils [None req-462407f8-1900-4026-8afd-6839acf17845 tempest-ServerDiagnosticsNegativeTest-1847614768 tempest-ServerDiagnosticsNegativeTest-1847614768-project-member] Lock "1e482ed7-9c9f-4713-abde-291417686a78" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 152.501s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 835.166182] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Starting instance... 
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 835.220719] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 835.220719] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 835.222205] env[67270]: INFO nova.compute.claims [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 835.700687] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2a7ecaf-3858-4be1-9eac-8764543e1c87 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.708685] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05b133da-6d05-4fb7-80ec-0ff08a9700d4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.739933] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf06cfd0-0b67-47c3-ab8d-d98127b9b160 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.747907] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95e332b8-ca5b-440f-8ce4-5ca4ac918158 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.762440] env[67270]: DEBUG nova.compute.provider_tree [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 835.771064] env[67270]: DEBUG nova.scheduler.client.report [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 835.785421] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 
tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.565s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 835.786127] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 835.819350] env[67270]: DEBUG nova.compute.utils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 835.821414] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 835.821661] env[67270]: DEBUG nova.network.neutron [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 835.836379] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 835.889925] env[67270]: DEBUG nova.policy [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dd12eac146664232beed2ac62d43219c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ca50dc5cf0c4e069fe0f592cc4c3bb5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 835.898862] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Start spawning the instance on the hypervisor. 
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 835.921704] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:11Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 835.921942] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 835.922108] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 835.922294] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 835.922439] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 835.922584] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 835.922826] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 835.923011] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
835.923188] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 835.923352] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 835.923524] env[67270]: DEBUG nova.virt.hardware [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 835.924402] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59ecaf30-41fc-4f65-944b-23c37e43e8ed {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.932792] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a4c5a66-d923-4e81-82c2-314b09f96213 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 836.222033] env[67270]: DEBUG nova.network.neutron [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Successfully created port: 1b6b43f7-23b2-4088-9933-ff0d804226e0 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 837.006735] env[67270]: DEBUG nova.network.neutron [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Successfully updated port: 1b6b43f7-23b2-4088-9933-ff0d804226e0 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 837.023487] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Acquiring lock "refresh_cache-2de499d5-2eb3-4138-8c6b-41fb94ff27eb" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 837.023647] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Acquired lock "refresh_cache-2de499d5-2eb3-4138-8c6b-41fb94ff27eb" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 837.023797] env[67270]: DEBUG nova.network.neutron [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 837.099662] env[67270]: DEBUG nova.network.neutron [None 
req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 837.424363] env[67270]: DEBUG nova.compute.manager [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Received event network-vif-plugged-1b6b43f7-23b2-4088-9933-ff0d804226e0 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 837.424718] env[67270]: DEBUG oslo_concurrency.lockutils [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] Acquiring lock "2de499d5-2eb3-4138-8c6b-41fb94ff27eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 837.424798] env[67270]: DEBUG oslo_concurrency.lockutils [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] Lock "2de499d5-2eb3-4138-8c6b-41fb94ff27eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 837.424931] env[67270]: DEBUG oslo_concurrency.lockutils [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] Lock "2de499d5-2eb3-4138-8c6b-41fb94ff27eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 837.425102] env[67270]: DEBUG nova.compute.manager [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] No waiting events found dispatching network-vif-plugged-1b6b43f7-23b2-4088-9933-ff0d804226e0 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 837.425311] env[67270]: WARNING nova.compute.manager [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Received unexpected event network-vif-plugged-1b6b43f7-23b2-4088-9933-ff0d804226e0 for instance with vm_state building and task_state spawning. [ 837.425418] env[67270]: DEBUG nova.compute.manager [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Received event network-changed-1b6b43f7-23b2-4088-9933-ff0d804226e0 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 837.425565] env[67270]: DEBUG nova.compute.manager [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Refreshing instance network info cache due to event network-changed-1b6b43f7-23b2-4088-9933-ff0d804226e0. 
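The network-vif-plugged handling just above follows Nova's event-waiter pattern: before an operation that should produce a Neutron notification, the compute manager registers the event names it expects, and an incoming external event either wakes a waiter or, as the WARNING shows for an instance still in vm_state building, arrives while nobody is waiting. A minimal, illustrative sketch of that pattern (plain threading, not Nova's actual implementation):

```python
import threading

class InstanceEvents:
    """Toy registry for named per-instance events, e.g.
    "network-vif-plugged-1b6b43f7-23b2-4088-9933-ff0d804226e0"."""

    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare_for_event(self, instance_uuid, event_name):
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = ev
        return ev  # caller blocks on ev.wait(timeout) during the operation

    def pop_instance_event(self, instance_uuid, event_name):
        # Mirrors the "<uuid>-events" lock acquire/release pairs above.
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)

    def dispatch(self, instance_uuid, event_name):
        ev = self.pop_instance_event(instance_uuid, event_name)
        if ev is None:
            # "No waiting events found dispatching ..." /
            # "Received unexpected event ..." in the log above.
            print(f"unexpected event {event_name} for {instance_uuid}")
        else:
            ev.set()
```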
{{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 837.425733] env[67270]: DEBUG oslo_concurrency.lockutils [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] Acquiring lock "refresh_cache-2de499d5-2eb3-4138-8c6b-41fb94ff27eb" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 837.576968] env[67270]: DEBUG nova.network.neutron [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Updating instance_info_cache with network_info: [{"id": "1b6b43f7-23b2-4088-9933-ff0d804226e0", "address": "fa:16:3e:61:9b:c0", "network": {"id": "351fe3c0-7b18-4429-bd74-1b026ab9fe40", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-678866969-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ca50dc5cf0c4e069fe0f592cc4c3bb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a27fd90b-16a5-43af-bede-ae36762ece00", "external-id": "nsx-vlan-transportzone-197", "segmentation_id": 197, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b6b43f7-23", "ovs_interfaceid": "1b6b43f7-23b2-4088-9933-ff0d804226e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 837.588986] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Releasing lock "refresh_cache-2de499d5-2eb3-4138-8c6b-41fb94ff27eb" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 837.589329] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Instance network_info: |[{"id": "1b6b43f7-23b2-4088-9933-ff0d804226e0", "address": "fa:16:3e:61:9b:c0", "network": {"id": "351fe3c0-7b18-4429-bd74-1b026ab9fe40", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-678866969-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ca50dc5cf0c4e069fe0f592cc4c3bb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a27fd90b-16a5-43af-bede-ae36762ece00", "external-id": "nsx-vlan-transportzone-197", "segmentation_id": 197, 
"bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b6b43f7-23", "ovs_interfaceid": "1b6b43f7-23b2-4088-9933-ff0d804226e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 837.589635] env[67270]: DEBUG oslo_concurrency.lockutils [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] Acquired lock "refresh_cache-2de499d5-2eb3-4138-8c6b-41fb94ff27eb" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 837.589809] env[67270]: DEBUG nova.network.neutron [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Refreshing network info cache for port 1b6b43f7-23b2-4088-9933-ff0d804226e0 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 837.591673] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:61:9b:c0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a27fd90b-16a5-43af-bede-ae36762ece00', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1b6b43f7-23b2-4088-9933-ff0d804226e0', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 837.599857] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Creating folder: Project (7ca50dc5cf0c4e069fe0f592cc4c3bb5). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 837.601094] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-394fb796-613b-418d-9fa8-97a5b32d8a2e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 837.614982] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Created folder: Project (7ca50dc5cf0c4e069fe0f592cc4c3bb5) in parent group-v814248. [ 837.615146] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Creating folder: Instances. Parent ref: group-v814289. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 837.615383] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b973d179-cf13-4b16-b30a-591c261bd187 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 837.625154] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Created folder: Instances in parent group-v814289. 
[ 837.625384] env[67270]: DEBUG oslo.service.loopingcall [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 837.625623] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 837.625770] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-790dd414-c792-470a-b47e-45bd6b1790dd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 837.657819] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 837.657819] env[67270]: value = "task-4110606" [ 837.657819] env[67270]: _type = "Task" [ 837.657819] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 837.667752] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110606, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 837.990656] env[67270]: DEBUG nova.network.neutron [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Updated VIF entry in instance network info cache for port 1b6b43f7-23b2-4088-9933-ff0d804226e0. {{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 837.991032] env[67270]: DEBUG nova.network.neutron [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Updating instance_info_cache with network_info: [{"id": "1b6b43f7-23b2-4088-9933-ff0d804226e0", "address": "fa:16:3e:61:9b:c0", "network": {"id": "351fe3c0-7b18-4429-bd74-1b026ab9fe40", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-678866969-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ca50dc5cf0c4e069fe0f592cc4c3bb5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a27fd90b-16a5-43af-bede-ae36762ece00", "external-id": "nsx-vlan-transportzone-197", "segmentation_id": 197, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b6b43f7-23", "ovs_interfaceid": "1b6b43f7-23b2-4088-9933-ff0d804226e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 838.000519] env[67270]: DEBUG oslo_concurrency.lockutils [req-35da6790-6b87-48c8-bb6b-87b0ad029d0f req-a7f59555-196c-454b-a43f-71ba24cc4af1 service nova] Releasing lock "refresh_cache-2de499d5-2eb3-4138-8c6b-41fb94ff27eb" {{(pid=67270) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 838.167811] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110606, 'name': CreateVM_Task, 'duration_secs': 0.334163} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 838.167975] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 838.168690] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 838.168852] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 838.169198] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 838.169447] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6294fbe9-50e2-48fa-af48-f8f17ddbbab9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 838.174559] env[67270]: DEBUG oslo_vmware.api [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Waiting for the task: (returnval){ [ 838.174559] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]5209d6c5-b98e-5248-e6ea-c3c9279e91bb" [ 838.174559] env[67270]: _type = "Task" [ 838.174559] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 838.182747] env[67270]: DEBUG oslo_vmware.api [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]5209d6c5-b98e-5248-e6ea-c3c9279e91bb, 'name': SearchDatastore_Task} progress is 0%. 
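The "Task: {...} progress is 0%" lines above and below come from polling vCenter task state until it reaches success or error. A simplified poller sketched with pyVmomi (oslo.vmware's wait_for_task layers a looping call, backoff, and fault translation on top of the same idea):

```python
import time
from pyVmomi import vim

def wait_for_task(task, poll_interval=0.5):
    """Poll a vCenter task object until it finishes; raise its fault on error."""
    while True:
        info = task.info
        if info.state == vim.TaskInfo.State.success:
            return info.result
        if info.state == vim.TaskInfo.State.error:
            raise info.error  # cf. the InvalidArgument fault later in this log
        # queued/running: report like the log's "progress is N%"
        print(f"Task {info.key} progress is {info.progress or 0}%")
        time.sleep(poll_interval)
```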
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 838.685871] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 838.686325] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 838.686662] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 870.760234] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 870.760234] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 870.760234] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 870.780805] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 870.780805] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 870.780805] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 870.780805] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Skipping network cache update for instance because it is Building. 
{{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 870.780805] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 870.781077] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 870.781327] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 870.781423] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 870.781560] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 870.781688] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 870.782203] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.758596] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 872.758309] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 872.758700] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 872.758770] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... 
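The periodic-task entries above (_heal_instance_info_cache skipping instances that are still Building, _reclaim_queued_deletes bailing out when the interval is unset) are driven by oslo.service's periodic_task machinery. A rough sketch of the wiring; the option registration and spacing values here are illustrative, not Nova's actual config:

```python
from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF
CONF.register_opts([cfg.IntOpt('reclaim_instance_interval', default=0)])
CONF([])  # parse an empty command line so option defaults are readable

class Manager(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(CONF)

    @periodic_task.periodic_task(spacing=60)
    def _heal_instance_info_cache(self, context):
        # Instances still Building are skipped, as in the
        # "Skipping network cache update" lines above.
        pass

    @periodic_task.periodic_task
    def _reclaim_queued_deletes(self, context):
        if CONF.reclaim_instance_interval <= 0:
            return  # "CONF.reclaim_instance_interval <= 0, skipping..."

Manager().run_periodic_tasks(context=None)  # normally driven by a timer
```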
{{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 872.758887] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 872.770985] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 872.771299] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 872.771470] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 872.771668] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 872.772812] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c3ecd86-dd45-4448-bffb-dce9f8005caf {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.782202] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fb288ec-f3d8-4ee3-96ae-da9ad0ffd9b4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.796947] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d516b717-8432-4f72-8053-594bc74f753d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.805027] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b641b8d3-3088-47fe-b35e-a5091bbabf6b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.836031] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180806MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 872.836214] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 872.836502] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 872.901679] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a51d9480-1aa1-48c9-a05c-943589d6a224 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.901860] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4a086288-b773-40aa-b39a-e3f3b9784a05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.901991] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance c847f4cb-1914-497b-8d63-5b99a237e5e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.902135] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 379f5a6d-d6d4-434a-b401-1b027434e6fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.902255] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a073c7a9-d7ee-4d9e-be23-4345ed5f9047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.902387] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 5d61c322-6a7d-4991-8cc4-6dcb1be74256 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.902503] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.902623] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8ddc70e6-ec6f-4740-8109-6ba2c5d00536 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.902710] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2de499d5-2eb3-4138-8c6b-41fb94ff27eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 872.914276] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 1000d79b-b491-4071-8ab0-aac90dac6b51 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.925881] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 69980b41-9514-4d97-aa75-ea68dd05b241 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.936326] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 49292f00-1457-438b-b5b7-2ac35dd464d2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.946834] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 87ef9733-e8d6-429e-b23f-8b8aadef784c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.957425] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2f050e13-5621-4dda-ade1-cfbef017e57e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.968105] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4a1a791f-36f3-48af-9792-4a9eaeba26c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
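Each "has allocations in placement" or "skipping heal of allocation" decision above is made against the consumer's record in the placement service. For reference, a sketch of reading those allocations over placement's REST API; the endpoint and token are assumed placeholders:

```python
import requests

PLACEMENT = "http://placement.example/placement"  # assumed endpoint
TOKEN = "<keystone token>"                        # assumed credential

def consumer_allocations(consumer_uuid):
    resp = requests.get(
        f"{PLACEMENT}/allocations/{consumer_uuid}",
        headers={"X-Auth-Token": TOKEN,
                 "OpenStack-API-Version": "placement 1.28"})
    resp.raise_for_status()
    # e.g. {"<rp_uuid>": {"resources": {"DISK_GB": 1, "MEMORY_MB": 128,
    #                                   "VCPU": 1}}}
    return resp.json()["allocations"]
```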
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.979748] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 907dfc72-e766-4a24-a4e7-df762db37824 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 872.991945] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance cbe3ecc4-3c5b-4749-a21c-c0376583c4aa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.003737] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance f42f9cc0-c33a-4bdc-b16c-8dec61896b27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.014280] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4e53a7b7-7194-4ceb-abef-5d0779effbfb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.030119] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2a6c8de3-8974-4533-a474-c4242fd735c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.041229] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4c9dbddd-4c74-4ee0-a1be-e7a5c7cfc344 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.074384] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance ee08ac0e-d7fb-4f36-962b-cb8b88bf6bb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.085256] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a9aaa31c-5228-4210-b3c0-ca8c5a8c6213 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.098378] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4dce8f09-ce7e-419c-90b4-48ee54d8c604 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.109161] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance c372287f-35e3-402a-9841-6f55ea471d3d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.121174] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2de2d5d9-2644-408a-8957-2c169b2793ce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.133350] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 65422c06-b1cf-4868-8f38-391b08038fc9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.146133] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance e976fd9e-95a3-4564-9bd6-08ee3f15a188 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.179608] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 662bb470-e6ed-4a37-bb23-74a0a36dff0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
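The final resource view reported just below (used_ram=1728MB, used_disk=9GB, used_vcpus=9) follows from the per-instance allocations listed above: eight instances at 128 MB plus one m1.micro at 192 MB, each with a 1 GB root disk and one vCPU, on top of the 512 MB the inventory reserves for the host:

```python
instance_ram_mb = [128] * 8 + [192]   # the nine actively managed instances
reserved_host_mb = 512                # MEMORY_MB 'reserved' in the inventory

used_ram = reserved_host_mb + sum(instance_ram_mb)
used_disk_gb = 9 * 1                  # one 1 GB root disk each
used_vcpus = 9 * 1                    # one vCPU each

assert (used_ram, used_disk_gb, used_vcpus) == (1728, 9, 9)
```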
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.191614] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 25cc189a-383b-450c-810d-85ea2b48fdca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 873.191882] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 873.192041] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1728MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 873.561609] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56b08e5e-acd1-46d3-a431-9e6d39fa26a8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.569691] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83a90913-660d-4a02-b9fe-4618f1c22b5c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.600501] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d496304-2490-411f-bd06-9c9281cc7f0d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.608806] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4060dde3-9995-4077-bcce-b53d9cf00dd7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.624695] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 873.635034] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 873.648516] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 873.648716] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.812s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 874.644081] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 874.644466] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 874.670056] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 874.670263] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 880.209263] env[67270]: DEBUG oslo_concurrency.lockutils [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquiring lock "4a086288-b773-40aa-b39a-e3f3b9784a05" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 880.373448] env[67270]: DEBUG oslo_concurrency.lockutils [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquiring lock "a51d9480-1aa1-48c9-a05c-943589d6a224" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 881.871982] env[67270]: DEBUG oslo_concurrency.lockutils [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquiring lock "c847f4cb-1914-497b-8d63-5b99a237e5e6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 882.021046] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquiring lock "379f5a6d-d6d4-434a-b401-1b027434e6fd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 883.865105] env[67270]: WARNING oslo_vmware.rw_handles [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Error occurred while reading the HTTP 
response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 883.865105] env[67270]: ERROR oslo_vmware.rw_handles [ 883.865734] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/06839709-f784-4266-98b5-5d4075ff8441/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 883.867527] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 883.867817] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Copying Virtual Disk [datastore1] vmware_temp/06839709-f784-4266-98b5-5d4075ff8441/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/06839709-f784-4266-98b5-5d4075ff8441/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 883.868120] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-aff3b6c7-5cda-4200-b4b6-4037ae3d898b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 883.877090] env[67270]: DEBUG oslo_vmware.api [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Waiting for the task: (returnval){ [ 883.877090] env[67270]: value = "task-4110607" [ 883.877090] env[67270]: _type = "Task" [ 883.877090] env[67270]: } to complete. 
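The WARNING/ERROR block above is the close path of the datastore write handle: the VMDK transfer has already completed, but when rw_handles calls getresponse() on close the remote end has dropped the connection, and the very next line confirms the image data did land on datastore1. The shape of that failure, illustrated with the standard library's http.client (host and path are placeholders):

```python
import http.client

conn = http.client.HTTPSConnection("esx.example")  # assumed host
conn.putrequest("PUT", "/folder/tmp.vmdk?dsName=datastore1")
conn.putheader("Content-Length", "0")
conn.endheaders()
try:
    conn.getresponse()  # server may close without sending a response
except http.client.RemoteDisconnected:
    # oslo.vmware logs the traceback seen above, but the bytes already
    # written are intact, so the download/caching flow continues.
    pass
finally:
    conn.close()
```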
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 883.886810] env[67270]: DEBUG oslo_vmware.api [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Task: {'id': task-4110607, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 884.387155] env[67270]: DEBUG oslo_vmware.exceptions [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Fault InvalidArgument not matched. {{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 884.387448] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 884.387993] env[67270]: ERROR nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 884.387993] env[67270]: Faults: ['InvalidArgument'] [ 884.387993] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Traceback (most recent call last): [ 884.387993] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 884.387993] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] yield resources [ 884.387993] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 884.387993] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] self.driver.spawn(context, instance, image_meta, [ 884.387993] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 884.387993] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] self._vmops.spawn(context, instance, image_meta, injected_files, [ 884.387993] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 884.387993] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] self._fetch_image_if_missing(context, vi) [ 884.387993] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] image_cache(vi, tmp_image_ds_loc) [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
537, in _cache_sparse_image [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] vm_util.copy_virtual_disk( [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] session._wait_for_task(vmdk_copy_task) [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] return self.wait_for_task(task_ref) [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] return evt.wait() [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] result = hub.switch() [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 884.388378] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] return self.greenlet.switch() [ 884.388734] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 884.388734] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] self.f(*self.args, **self.kw) [ 884.388734] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 884.388734] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] raise exceptions.translate_fault(task_info.error) [ 884.388734] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 884.388734] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Faults: ['InvalidArgument'] [ 884.388734] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] [ 884.388734] env[67270]: INFO nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Terminating instance [ 884.389814] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 884.390042] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 884.390285] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4c577750-8a7e-4435-9ccf-582896e5e01a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.392650] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 884.392836] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 884.393582] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab83f2e3-3721-4f49-8ae3-bdb565ff7323 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.401159] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 884.401385] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9f17107a-da70-4e70-9527-4f0f6c915a0f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.403965] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 884.404157] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 884.405138] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-04dd4c35-259a-44bc-b7fc-9d8352676339 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.411135] env[67270]: DEBUG oslo_vmware.api [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Waiting for the task: (returnval){ [ 884.411135] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52e6478a-caf5-05c6-1b2c-01b1fc6645ab" [ 884.411135] env[67270]: _type = "Task" [ 884.411135] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 884.420227] env[67270]: DEBUG oslo_vmware.api [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52e6478a-caf5-05c6-1b2c-01b1fc6645ab, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 884.483670] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 884.483670] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 884.483670] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Deleting the datastore file [datastore1] a51d9480-1aa1-48c9-a05c-943589d6a224 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 884.483670] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-92ceb7b0-205e-402d-bc16-717b779af6dc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.491181] env[67270]: DEBUG oslo_vmware.api [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Waiting for the task: (returnval){ [ 884.491181] env[67270]: value = "task-4110609" [ 884.491181] env[67270]: _type = "Task" [ 884.491181] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 884.499595] env[67270]: DEBUG oslo_vmware.api [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Task: {'id': task-4110609, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 884.921591] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 884.921953] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Creating directory with path [datastore1] vmware_temp/fd581990-d817-4665-802e-5d71078833ee/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 884.922211] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a98528be-de11-47ec-96c8-82162c39d233 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.934936] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Created directory with path [datastore1] vmware_temp/fd581990-d817-4665-802e-5d71078833ee/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 884.935161] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Fetch image to [datastore1] vmware_temp/fd581990-d817-4665-802e-5d71078833ee/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 884.935333] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/fd581990-d817-4665-802e-5d71078833ee/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 884.936259] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de4af3b7-d99b-49bc-9f27-63a7ed99f836 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.943495] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89ef71de-7cd2-4ad7-9085-90371e864c90 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.953184] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e074df0-bf5a-49b5-b127-21480393dc7d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.985283] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-164ef01f-746d-45c1-9ce7-d56bf1ed8bb1 {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.995117] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7c27ea06-2138-4667-af84-e2b25d54fe78 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.002668] env[67270]: DEBUG oslo_vmware.api [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Task: {'id': task-4110609, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079299} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 885.002805] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 885.002967] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 885.003157] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 885.003360] env[67270]: INFO nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Took 0.61 seconds to destroy the instance on the hypervisor. 
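
The exchange above is the standard oslo.vmware task pattern that recurs throughout this log: a vCenter method returning a Task managed object is invoked (here FileManager.DeleteDatastoreFile_Task, surfaced as task-4110609), and wait_for_task() then polls Task.info — producing the "progress is 0%" lines — until the task completes or errors. A minimal sketch of that pattern; the endpoint, credentials, and dc_ref below are placeholders, and the actual Nova path wraps this call in nova.virt.vmwareapi.ds_util.file_delete():

    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vexc

    # Placeholder endpoint/credentials; constructing the session performs
    # SessionManager.Login against the vCenter.
    session = vmware_api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)
    dc_ref = None  # hypothetical Datacenter moref, needed for
                   # "[datastore] path" style file names

    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] a51d9480-1aa1-48c9-a05c-943589d6a224',
        datacenter=dc_ref)
    try:
        session.wait_for_task(task)  # emits the "progress is N%" polls
    except vexc.VimFaultException as e:
        # Task errors surface here with a fault_list, e.g.
        # ['InvalidArgument'] as in the traceback earlier in this log.
        print('task failed: %s %s' % (e, e.fault_list))
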
[ 885.005613] env[67270]: DEBUG nova.compute.claims [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 885.005787] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 885.005996] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 885.019776] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 885.074326] env[67270]: DEBUG oslo_vmware.rw_handles [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fd581990-d817-4665-802e-5d71078833ee/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 885.134033] env[67270]: DEBUG oslo_vmware.rw_handles [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 885.134283] env[67270]: DEBUG oslo_vmware.rw_handles [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fd581990-d817-4665-802e-5d71078833ee/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 885.473753] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1311da5d-6cc9-430a-bab6-eaaedd3dea8f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.482043] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ced3a326-bc5b-4003-9041-c992339d0c61 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.513201] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d983215f-0ef4-45b6-a506-260bee21da5b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.521544] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1959cbb-b7d3-442a-9e7b-57f1cfa74184 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.535679] env[67270]: DEBUG nova.compute.provider_tree [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 885.544849] env[67270]: DEBUG nova.scheduler.client.report [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 885.558853] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.553s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.559418] env[67270]: ERROR nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 885.559418] env[67270]: Faults: ['InvalidArgument'] [ 885.559418] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Traceback (most recent call last): [ 885.559418] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 885.559418] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] 
self.driver.spawn(context, instance, image_meta, [ 885.559418] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 885.559418] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] self._vmops.spawn(context, instance, image_meta, injected_files, [ 885.559418] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 885.559418] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] self._fetch_image_if_missing(context, vi) [ 885.559418] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 885.559418] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] image_cache(vi, tmp_image_ds_loc) [ 885.559418] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] vm_util.copy_virtual_disk( [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] session._wait_for_task(vmdk_copy_task) [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] return self.wait_for_task(task_ref) [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] return evt.wait() [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] result = hub.switch() [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] return self.greenlet.switch() [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 885.559760] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] self.f(*self.args, **self.kw) [ 885.560114] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 885.560114] env[67270]: ERROR nova.compute.manager [instance: 
a51d9480-1aa1-48c9-a05c-943589d6a224] raise exceptions.translate_fault(task_info.error) [ 885.560114] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 885.560114] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Faults: ['InvalidArgument'] [ 885.560114] env[67270]: ERROR nova.compute.manager [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] [ 885.560416] env[67270]: DEBUG nova.compute.utils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 885.561875] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Build of instance a51d9480-1aa1-48c9-a05c-943589d6a224 was re-scheduled: A specified parameter was not correct: fileType [ 885.561875] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 885.562298] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 885.562474] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 885.562627] env[67270]: DEBUG nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 885.562791] env[67270]: DEBUG nova.network.neutron [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 885.814571] env[67270]: DEBUG nova.network.neutron [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 885.826088] env[67270]: INFO nova.compute.manager [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Took 0.26 seconds to deallocate network for instance.
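
Both tracebacks above fail at the same call: while caching the sparse image, vm_util.copy_virtual_disk() submits VirtualDiskManager.CopyVirtualDisk_Task, the vCenter task errors with InvalidArgument on fileType, wait_for_task() re-raises it, and the build is torn down and re-scheduled. A hedged sketch of that failing call shape — the paths and morefs are placeholders, and the real helper is nova.virt.vmwareapi.vm_util.copy_virtual_disk():

    from oslo_vmware import api as vmware_api

    # Placeholder connection values, as in the earlier sketch.
    session = vmware_api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)
    dc_ref = None  # hypothetical Datacenter moref

    disk_mgr = session.vim.service_content.virtualDiskManager
    copy_task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore1] vmware_temp/tmp-sparse.vmdk',
        sourceDatacenter=dc_ref,
        destName='[datastore1] devstack-image-cache_base/cache.vmdk',
        destDatacenter=dc_ref)
    # wait_for_task() polls the task and re-raises its error; this is the
    # frame that produced "A specified parameter was not correct: fileType".
    session.wait_for_task(copy_task)
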
[ 885.918294] env[67270]: INFO nova.scheduler.client.report [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Deleted allocations for instance a51d9480-1aa1-48c9-a05c-943589d6a224 [ 885.939026] env[67270]: DEBUG oslo_concurrency.lockutils [None req-58c016a8-1abd-4a7a-befa-227a1cb87a63 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Lock "a51d9480-1aa1-48c9-a05c-943589d6a224" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 204.324s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.939026] env[67270]: DEBUG oslo_concurrency.lockutils [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Lock "a51d9480-1aa1-48c9-a05c-943589d6a224" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 5.564s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 885.939026] env[67270]: DEBUG oslo_concurrency.lockutils [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Acquiring lock "a51d9480-1aa1-48c9-a05c-943589d6a224-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 885.939026] env[67270]: DEBUG oslo_concurrency.lockutils [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Lock "a51d9480-1aa1-48c9-a05c-943589d6a224-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 885.939469] env[67270]: DEBUG oslo_concurrency.lockutils [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Lock "a51d9480-1aa1-48c9-a05c-943589d6a224-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.941083] env[67270]: INFO nova.compute.manager [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Terminating instance [ 885.942910] env[67270]: DEBUG nova.compute.manager [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Start destroying the instance on the hypervisor.
{{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 885.943293] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 885.943873] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c6b9a2c0-8389-4246-901a-55a08a6ef79e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.953747] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04f2779d-9cad-47c7-9730-2e771a77d683 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.966495] env[67270]: DEBUG nova.compute.manager [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 885.988319] env[67270]: WARNING nova.virt.vmwareapi.vmops [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a51d9480-1aa1-48c9-a05c-943589d6a224 could not be found. [ 885.988544] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 885.988744] env[67270]: INFO nova.compute.manager [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Took 0.05 seconds to destroy the instance on the hypervisor. [ 885.988996] env[67270]: DEBUG oslo.service.loopingcall [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 885.989234] env[67270]: DEBUG nova.compute.manager [-] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 885.989331] env[67270]: DEBUG nova.network.neutron [-] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 886.019691] env[67270]: DEBUG nova.network.neutron [-] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 886.021911] env[67270]: DEBUG oslo_concurrency.lockutils [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.022159] env[67270]: DEBUG oslo_concurrency.lockutils [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.023665] env[67270]: INFO nova.compute.claims [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 886.029429] env[67270]: INFO nova.compute.manager [-] [instance: a51d9480-1aa1-48c9-a05c-943589d6a224] Took 0.04 seconds to deallocate network for instance. 
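
The resource-claim bracket seen here (acquire "compute_resources", report inventory, release) repeats for every build. The inventory dict logged for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd maps to schedulable capacity via placement's usual rule, usable = (total - reserved) * allocation_ratio; a small worked example with the exact values from this log:

    # Inventory as logged for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd
    # (min_unit/max_unit/step_size omitted; they bound per-request sizing,
    # not total capacity).
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        usable = int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
        print(rc, usable)  # -> VCPU 192, MEMORY_MB 196078, DISK_GB 400
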
[ 886.140643] env[67270]: DEBUG oslo_concurrency.lockutils [None req-d2ff9f29-b1cc-4099-8e99-ff95fcb6f496 tempest-TenantUsagesTestJSON-1792553892 tempest-TenantUsagesTestJSON-1792553892-project-member] Lock "a51d9480-1aa1-48c9-a05c-943589d6a224" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.203s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 886.463893] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fc59003-4712-46e4-8247-c1b69359cce9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.474341] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a79ad6b2-516f-4ffc-b8e1-b3367f2a869d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.505967] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eccf977b-c62d-40b2-b293-17d1c55e1316 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.514609] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-436cc736-597a-4ab8-816b-9fee17eb1806 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.531818] env[67270]: DEBUG nova.compute.provider_tree [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 886.540709] env[67270]: DEBUG nova.scheduler.client.report [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 886.554687] env[67270]: DEBUG oslo_concurrency.lockutils [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.532s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 886.555083] env[67270]: DEBUG nova.compute.manager [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Start building networks asynchronously for instance.
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 886.591115] env[67270]: DEBUG nova.compute.utils [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 886.592609] env[67270]: DEBUG nova.compute.manager [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 886.592784] env[67270]: DEBUG nova.network.neutron [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 886.601393] env[67270]: DEBUG nova.compute.manager [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 886.641514] env[67270]: INFO nova.virt.block_device [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Booting with volume acb318dc-85b5-4555-a128-55b3932ac7fc at /dev/sda [ 886.663488] env[67270]: DEBUG nova.policy [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '026d2e024a0e4319a88b9b31b0ec243d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9e6afd7b877a407e8c006a668e763477', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 886.689831] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a28b480f-dbd5-4877-b6ce-b562b320b321 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.698891] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-939b25f0-a951-4fb7-8e45-72787a2129eb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.729110] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a12a17e0-5548-4015-875f-64f57e7809b1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.737178] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdff282f-68c7-4889-bea6-9977da2b090d {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.766128] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4c05e4f-1e35-44a3-beb3-fadeef5a7dd8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.773628] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d63ff5cd-7245-4a00-977f-a68543ede176 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.792150] env[67270]: DEBUG nova.virt.block_device [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Updating existing volume attachment record: e5c8f829-a829-46f6-aa3c-f2266c5d9807 {{(pid=67270) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 887.030850] env[67270]: DEBUG nova.compute.manager [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 887.031425] env[67270]: DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 887.031633] env[67270]: DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 887.031788] env[67270]: DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 887.031968] env[67270]: DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 887.032127] env[67270]: DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 887.032274] env[67270]: 
DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 887.032481] env[67270]: DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 887.032782] env[67270]: DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 887.032896] env[67270]: DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 887.033072] env[67270]: DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 887.033294] env[67270]: DEBUG nova.virt.hardware [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 887.034228] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c45a1bd1-2bc9-4376-80e8-de6180dae060 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 887.043193] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44c5ac8a-6190-4ca9-a997-6c810b161d83 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 887.059896] env[67270]: DEBUG nova.network.neutron [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Successfully created port: e4c3118a-623f-4306-9869-8309fdebd171 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 888.028145] env[67270]: DEBUG nova.network.neutron [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Successfully updated port: e4c3118a-623f-4306-9869-8309fdebd171 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 888.046279] env[67270]: DEBUG oslo_concurrency.lockutils [None 
req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Acquiring lock "refresh_cache-1000d79b-b491-4071-8ab0-aac90dac6b51" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 888.046279] env[67270]: DEBUG oslo_concurrency.lockutils [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Acquired lock "refresh_cache-1000d79b-b491-4071-8ab0-aac90dac6b51" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 888.046279] env[67270]: DEBUG nova.network.neutron [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 888.114112] env[67270]: DEBUG nova.network.neutron [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 888.195249] env[67270]: DEBUG nova.compute.manager [req-7de40aa7-1295-49b9-948f-6e159aa2978b req-6f6fe441-01b5-4ca0-a49d-3e904582f05c service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Received event network-vif-plugged-e4c3118a-623f-4306-9869-8309fdebd171 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 888.197388] env[67270]: DEBUG oslo_concurrency.lockutils [req-7de40aa7-1295-49b9-948f-6e159aa2978b req-6f6fe441-01b5-4ca0-a49d-3e904582f05c service nova] Acquiring lock "1000d79b-b491-4071-8ab0-aac90dac6b51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 888.197388] env[67270]: DEBUG oslo_concurrency.lockutils [req-7de40aa7-1295-49b9-948f-6e159aa2978b req-6f6fe441-01b5-4ca0-a49d-3e904582f05c service nova] Lock "1000d79b-b491-4071-8ab0-aac90dac6b51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 888.197388] env[67270]: DEBUG oslo_concurrency.lockutils [req-7de40aa7-1295-49b9-948f-6e159aa2978b req-6f6fe441-01b5-4ca0-a49d-3e904582f05c service nova] Lock "1000d79b-b491-4071-8ab0-aac90dac6b51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 888.197388] env[67270]: DEBUG nova.compute.manager [req-7de40aa7-1295-49b9-948f-6e159aa2978b req-6f6fe441-01b5-4ca0-a49d-3e904582f05c service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] No waiting events found dispatching network-vif-plugged-e4c3118a-623f-4306-9869-8309fdebd171 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 888.197945] env[67270]: WARNING nova.compute.manager [req-7de40aa7-1295-49b9-948f-6e159aa2978b req-6f6fe441-01b5-4ca0-a49d-3e904582f05c service nova] [instance:
1000d79b-b491-4071-8ab0-aac90dac6b51] Received unexpected event network-vif-plugged-e4c3118a-623f-4306-9869-8309fdebd171 for instance with vm_state building and task_state spawning. [ 888.351585] env[67270]: DEBUG nova.network.neutron [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Updating instance_info_cache with network_info: [{"id": "e4c3118a-623f-4306-9869-8309fdebd171", "address": "fa:16:3e:e7:6d:05", "network": {"id": "15169a3a-b8fe-4f1c-9ab6-e9a01b6dcfb7", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-617962643-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9e6afd7b877a407e8c006a668e763477", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bfbfc55d-8126-40dd-998e-8600ea92f97c", "external-id": "nsx-vlan-transportzone-650", "segmentation_id": 650, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape4c3118a-62", "ovs_interfaceid": "e4c3118a-623f-4306-9869-8309fdebd171", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 888.364173] env[67270]: DEBUG oslo_concurrency.lockutils [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Releasing lock "refresh_cache-1000d79b-b491-4071-8ab0-aac90dac6b51" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 888.364497] env[67270]: DEBUG nova.compute.manager [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Instance network_info: |[{"id": "e4c3118a-623f-4306-9869-8309fdebd171", "address": "fa:16:3e:e7:6d:05", "network": {"id": "15169a3a-b8fe-4f1c-9ab6-e9a01b6dcfb7", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-617962643-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9e6afd7b877a407e8c006a668e763477", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bfbfc55d-8126-40dd-998e-8600ea92f97c", "external-id": "nsx-vlan-transportzone-650", "segmentation_id": 650, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape4c3118a-62", "ovs_interfaceid": "e4c3118a-623f-4306-9869-8309fdebd171", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 888.364882] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e7:6d:05', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bfbfc55d-8126-40dd-998e-8600ea92f97c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e4c3118a-623f-4306-9869-8309fdebd171', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 888.372625] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Creating folder: Project (9e6afd7b877a407e8c006a668e763477). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 888.373508] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b1d156a8-65a9-4263-a852-c7e3bd6a1c13 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.389218] env[67270]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 888.389413] env[67270]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=67270) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 888.389959] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Folder already exists: Project (9e6afd7b877a407e8c006a668e763477). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 888.390177] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Creating folder: Instances. Parent ref: group-v814282. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 888.390402] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e71b53a8-21a2-4716-a2e7-934076e6de44 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.401095] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Created folder: Instances in parent group-v814282. [ 888.401427] env[67270]: DEBUG oslo.service.loopingcall [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 888.401526] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 888.401747] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-61063625-cbb2-4a60-90b7-c7cffceb9769 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.422221] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 888.422221] env[67270]: value = "task-4110612" [ 888.422221] env[67270]: _type = "Task" [ 888.422221] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 888.430314] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110612, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 888.932770] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110612, 'name': CreateVM_Task, 'duration_secs': 0.308781} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 888.932947] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 888.933584] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': True, 'guest_format': None, 'boot_index': 0, 'disk_bus': None, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-814285', 'volume_id': 'acb318dc-85b5-4555-a128-55b3932ac7fc', 'name': 'volume-acb318dc-85b5-4555-a128-55b3932ac7fc', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1000d79b-b491-4071-8ab0-aac90dac6b51', 'attached_at': '', 'detached_at': '', 'volume_id': 'acb318dc-85b5-4555-a128-55b3932ac7fc', 'serial': 'acb318dc-85b5-4555-a128-55b3932ac7fc'}, 'attachment_id': 'e5c8f829-a829-46f6-aa3c-f2266c5d9807', 'device_type': None, 'mount_device': '/dev/sda', 'volume_type': None}], 'swap': None} {{(pid=67270) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 888.933807] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Root volume attach. 
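
The repeated "Task: ... progress is N%" lines are produced by oslo.vmware's fixed-interval task poll: the vCenter task object is re-read until it reaches a terminal state. A self-contained sketch of that loop; read_task_state, the interval, and the timeout are assumptions for illustration, not the library's real signature:

    import time

    def wait_for_task(read_task_state, interval=0.5, timeout=300.0):
        # Re-read the task until it succeeds or errors, logging progress
        # each round the way _poll_task does in the log above.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            state, progress, error = read_task_state()
            print(f"Task progress is {progress}%.")
            if state == "success":
                return
            if state == "error":
                raise RuntimeError(error)
            time.sleep(interval)
        raise TimeoutError("task did not complete in time")
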
Driver type: vmdk {{(pid=67270) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 888.934597] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82edfc10-a7f2-40ee-9320-0f8726658c19 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.943022] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2820e0dc-6ebe-4204-941f-ef38e923af3b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.949671] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0b65f3e-cdfa-4633-baec-ebf2b587dd22 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.956506] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-727b7f22-9573-4080-9a01-48c57f6f83fc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.964450] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for the task: (returnval){ [ 888.964450] env[67270]: value = "task-4110613" [ 888.964450] env[67270]: _type = "Task" [ 888.964450] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 888.972827] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110613, 'name': RelocateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 889.477991] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110613, 'name': RelocateVM_Task} progress is 42%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 889.978401] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110613, 'name': RelocateVM_Task} progress is 54%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 890.236327] env[67270]: DEBUG nova.compute.manager [req-6d5eec42-8591-44e8-b4e8-9baff0273b55 req-cb1ce54c-a56c-4758-8cdc-74e261848210 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Received event network-changed-e4c3118a-623f-4306-9869-8309fdebd171 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 890.236327] env[67270]: DEBUG nova.compute.manager [req-6d5eec42-8591-44e8-b4e8-9baff0273b55 req-cb1ce54c-a56c-4758-8cdc-74e261848210 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Refreshing instance network info cache due to event network-changed-e4c3118a-623f-4306-9869-8309fdebd171. 
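
Every network-changed event above funnels into the same guarded refresh: take the per-instance "refresh_cache-<uuid>" lock, rebuild the cache from Neutron, release. A compact stdlib sketch of that serialization (refresh_cache and fetch_network_info are illustrative names, not Nova's):

    import threading

    _cache_locks = {}      # instance UUID -> lock
    _nw_info_cache = {}    # instance UUID -> cached network_info list

    def refresh_cache(instance_uuid, fetch_network_info):
        # Serialize refreshes per instance so two concurrent events cannot
        # interleave partial writes to the same cache entry.
        lock = _cache_locks.setdefault(instance_uuid, threading.Lock())
        with lock:
            _nw_info_cache[instance_uuid] = fetch_network_info(instance_uuid)
            return _nw_info_cache[instance_uuid]
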
{{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 890.236606] env[67270]: DEBUG oslo_concurrency.lockutils [req-6d5eec42-8591-44e8-b4e8-9baff0273b55 req-cb1ce54c-a56c-4758-8cdc-74e261848210 service nova] Acquiring lock "refresh_cache-1000d79b-b491-4071-8ab0-aac90dac6b51" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 890.236762] env[67270]: DEBUG oslo_concurrency.lockutils [req-6d5eec42-8591-44e8-b4e8-9baff0273b55 req-cb1ce54c-a56c-4758-8cdc-74e261848210 service nova] Acquired lock "refresh_cache-1000d79b-b491-4071-8ab0-aac90dac6b51" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 890.236798] env[67270]: DEBUG nova.network.neutron [req-6d5eec42-8591-44e8-b4e8-9baff0273b55 req-cb1ce54c-a56c-4758-8cdc-74e261848210 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Refreshing network info cache for port e4c3118a-623f-4306-9869-8309fdebd171 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 890.482818] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110613, 'name': RelocateVM_Task} progress is 67%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 890.710042] env[67270]: DEBUG nova.network.neutron [req-6d5eec42-8591-44e8-b4e8-9baff0273b55 req-cb1ce54c-a56c-4758-8cdc-74e261848210 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Updated VIF entry in instance network info cache for port e4c3118a-623f-4306-9869-8309fdebd171. {{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 890.710470] env[67270]: DEBUG nova.network.neutron [req-6d5eec42-8591-44e8-b4e8-9baff0273b55 req-cb1ce54c-a56c-4758-8cdc-74e261848210 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Updating instance_info_cache with network_info: [{"id": "e4c3118a-623f-4306-9869-8309fdebd171", "address": "fa:16:3e:e7:6d:05", "network": {"id": "15169a3a-b8fe-4f1c-9ab6-e9a01b6dcfb7", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-617962643-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9e6afd7b877a407e8c006a668e763477", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bfbfc55d-8126-40dd-998e-8600ea92f97c", "external-id": "nsx-vlan-transportzone-650", "segmentation_id": 650, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape4c3118a-62", "ovs_interfaceid": "e4c3118a-623f-4306-9869-8309fdebd171", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 890.724013] env[67270]: DEBUG oslo_concurrency.lockutils [req-6d5eec42-8591-44e8-b4e8-9baff0273b55 req-cb1ce54c-a56c-4758-8cdc-74e261848210 service nova] Releasing lock 
"refresh_cache-1000d79b-b491-4071-8ab0-aac90dac6b51" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 890.980146] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110613, 'name': RelocateVM_Task} progress is 82%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 891.478050] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110613, 'name': RelocateVM_Task} progress is 97%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 891.978726] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110613, 'name': RelocateVM_Task} progress is 98%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 892.479586] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110613, 'name': RelocateVM_Task, 'duration_secs': 3.213878} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 892.479871] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Volume attach. 
Driver type: vmdk {{(pid=67270) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 892.480084] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-814285', 'volume_id': 'acb318dc-85b5-4555-a128-55b3932ac7fc', 'name': 'volume-acb318dc-85b5-4555-a128-55b3932ac7fc', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1000d79b-b491-4071-8ab0-aac90dac6b51', 'attached_at': '', 'detached_at': '', 'volume_id': 'acb318dc-85b5-4555-a128-55b3932ac7fc', 'serial': 'acb318dc-85b5-4555-a128-55b3932ac7fc'} {{(pid=67270) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 892.480933] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-039aa0dd-35ea-4d5b-83c6-113f97a8146f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.498595] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4502af5-e988-4fb0-86f2-104c281daf9e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.524882] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Reconfiguring VM instance instance-0000000e to attach disk [datastore1] volume-acb318dc-85b5-4555-a128-55b3932ac7fc/volume-acb318dc-85b5-4555-a128-55b3932ac7fc.vmdk or device None with type thin {{(pid=67270) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 892.525009] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-6f6f9220-1948-4c88-9665-96e1f57c024b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.544918] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for the task: (returnval){ [ 892.544918] env[67270]: value = "task-4110614" [ 892.544918] env[67270]: _type = "Task" [ 892.544918] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 892.553710] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110614, 'name': ReconfigVM_Task} progress is 5%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 893.054783] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110614, 'name': ReconfigVM_Task, 'duration_secs': 0.300363} completed successfully. 
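
The "Reconfiguring VM instance ... to attach disk ... with type thin" step amounts to one ReconfigVM_Task carrying a single 'add' device change that points the VM at the existing volume VMDK. A rough pyVmomi-style sketch of such a spec, assuming pyVmomi is installed; the controller key, unit number, and temporary device key are placeholders the real code computes:

    from pyVmomi import vim

    def make_attach_disk_spec(vmdk_path, controller_key, unit_number):
        # Describe an existing thin-provisioned VMDK and add it to the VM
        # in one device change, comparable to attach_disk_to_vm above.
        disk = vim.vm.device.VirtualDisk()
        disk.key = -101                    # temporary key for a new device
        disk.controllerKey = controller_key
        disk.unitNumber = unit_number
        disk.backing = vim.vm.device.VirtualDisk.FlatVer2BackingInfo(
            fileName=vmdk_path, diskMode="persistent", thinProvisioned=True)
        change = vim.vm.device.VirtualDeviceSpec(
            operation=vim.vm.device.VirtualDeviceSpec.Operation.add,
            device=disk)
        return vim.vm.ConfigSpec(deviceChange=[change])

The returned spec would be submitted with vm.ReconfigVM_Task(spec=...) and the task handle polled exactly like the other tasks in this log.
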
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 893.055169] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Reconfigured VM instance instance-0000000e to attach disk [datastore1] volume-acb318dc-85b5-4555-a128-55b3932ac7fc/volume-acb318dc-85b5-4555-a128-55b3932ac7fc.vmdk or device None with type thin {{(pid=67270) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 893.059789] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-4bf18b9b-bad6-483b-a726-614fc4fc4c4e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.076210] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for the task: (returnval){ [ 893.076210] env[67270]: value = "task-4110615" [ 893.076210] env[67270]: _type = "Task" [ 893.076210] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 893.085012] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110615, 'name': ReconfigVM_Task} progress is 5%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 893.586206] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110615, 'name': ReconfigVM_Task, 'duration_secs': 0.147505} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 893.586542] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-814285', 'volume_id': 'acb318dc-85b5-4555-a128-55b3932ac7fc', 'name': 'volume-acb318dc-85b5-4555-a128-55b3932ac7fc', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1000d79b-b491-4071-8ab0-aac90dac6b51', 'attached_at': '', 'detached_at': '', 'volume_id': 'acb318dc-85b5-4555-a128-55b3932ac7fc', 'serial': 'acb318dc-85b5-4555-a128-55b3932ac7fc'} {{(pid=67270) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 893.587255] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-1ae61eba-836e-4c6d-8753-55f599ccc5d0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.594466] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for the task: (returnval){ [ 893.594466] env[67270]: value = "task-4110616" [ 893.594466] env[67270]: _type = "Task" [ 893.594466] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 893.602772] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110616, 'name': Rename_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 894.104408] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110616, 'name': Rename_Task, 'duration_secs': 0.134277} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 894.104747] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Powering on the VM {{(pid=67270) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 894.104911] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-bec31bd7-52ce-4bd5-90d6-982e6e884270 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 894.111187] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for the task: (returnval){ [ 894.111187] env[67270]: value = "task-4110617" [ 894.111187] env[67270]: _type = "Task" [ 894.111187] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 894.120303] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110617, 'name': PowerOnVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 894.621212] env[67270]: DEBUG oslo_vmware.api [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110617, 'name': PowerOnVM_Task, 'duration_secs': 0.450815} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 894.622030] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Powered on the VM {{(pid=67270) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 894.622030] env[67270]: INFO nova.compute.manager [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Took 7.59 seconds to spawn the instance on the hypervisor. [ 894.622030] env[67270]: DEBUG nova.compute.manager [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Checking state {{(pid=67270) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 894.622783] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-304034b4-c04b-4fc3-abcd-feba21f06af1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 894.672025] env[67270]: INFO nova.compute.manager [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Took 8.66 seconds to build instance. [ 894.682833] env[67270]: DEBUG oslo_concurrency.lockutils [None req-00cadc87-679d-4be1-8fab-8c461b46a881 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "1000d79b-b491-4071-8ab0-aac90dac6b51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 132.867s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 894.694203] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Starting instance...
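
The waited/held figures in the lockutils lines ("waited 0.000s", "held 132.867s") are plain monotonic-clock deltas taken around acquire and release. A stdlib context manager reproducing that instrumentation (message format approximated from the log, not lockutils' actual code):

    import threading
    import time
    from contextlib import contextmanager

    @contextmanager
    def timed_lock(lock: threading.Lock, name: str):
        # Measure time spent waiting for the lock and time spent holding
        # it, then log both in the lockutils style seen above.
        t0 = time.monotonic()
        with lock:
            t1 = time.monotonic()
            print(f'Lock "{name}" acquired :: waited {t1 - t0:.3f}s')
            try:
                yield
            finally:
                held = time.monotonic() - t1
                print(f'Lock "{name}" "released" :: held {held:.3f}s')
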
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 894.744255] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 894.744510] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 894.746235] env[67270]: INFO nova.compute.claims [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 895.156098] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c07749d-4ecf-40b1-8aec-68051389140d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.164351] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8acddced-cd85-49ac-98a4-4d5149dd100a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.196163] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfcac6e3-0991-4aba-ae5b-6b7c61e3ccb9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.208179] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b2a8181-7c78-4950-8e5a-05b42de21ce6 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.223203] env[67270]: DEBUG nova.compute.provider_tree [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 895.224911] env[67270]: DEBUG oslo_concurrency.lockutils [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquiring lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 895.231377] env[67270]: DEBUG nova.scheduler.client.report [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB':
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 895.250116] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.503s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 895.250116] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 895.290776] env[67270]: DEBUG nova.compute.utils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 895.292297] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 895.292513] env[67270]: DEBUG nova.network.neutron [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 895.303083] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 895.373081] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Start spawning the instance on the hypervisor. 
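
Given the inventory just reported, schedulable capacity per resource class is (total - reserved) * allocation_ratio, which is how Placement turns 48 physical vCPUs into 192 schedulable ones here. A quick check with the exact numbers from the log:

    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        # Placement's effective capacity: (total - reserved) * ratio.
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)   # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
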
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 895.395638] env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 895.395895] env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 895.396067] env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 895.396254] env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 895.396402] env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 895.396550] env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 895.396799] env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 895.397009] env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 895.397185] 
env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 895.397351] env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 895.397538] env[67270]: DEBUG nova.virt.hardware [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 895.398695] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8f79713-388a-4dd1-8a81-3286588aca5c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.407687] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aa708fa-5261-4fcb-9f50-890d5cd41cbb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.670039] env[67270]: DEBUG nova.policy [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '82a5520168774cfbb485fe542a08145b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '97c1a11018334dd78bf3cf918d205130', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 896.325815] env[67270]: DEBUG nova.compute.manager [req-19b6f823-5f7c-487b-b2e9-86383b8c583b req-0081dc42-8c89-4eda-be03-e362688dcba8 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Received event network-changed-e4c3118a-623f-4306-9869-8309fdebd171 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 896.326214] env[67270]: DEBUG nova.compute.manager [req-19b6f823-5f7c-487b-b2e9-86383b8c583b req-0081dc42-8c89-4eda-be03-e362688dcba8 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Refreshing instance network info cache due to event network-changed-e4c3118a-623f-4306-9869-8309fdebd171. 
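
The topology walk above enumerates every sockets x cores x threads factorization of the vCPU count that fits the flavor and image limits; with one vCPU and the 65536 defaults, only (1, 1, 1) survives, matching "Got 1 possible topologies". A small sketch of that enumeration (a simplification of what nova.virt.hardware does, not its actual code):

    from itertools import product

    def possible_cpu_topologies(vcpus, max_sockets=65536, max_cores=65536,
                                max_threads=65536):
        # Yield each (sockets, cores, threads) triple whose product is
        # exactly the vCPU count, within the given maxima.
        bound = lambda m: range(1, min(vcpus, m) + 1)
        for s, c, t in product(bound(max_sockets), bound(max_cores),
                               bound(max_threads)):
            if s * c * t == vcpus:
                yield (s, c, t)

    print(list(possible_cpu_topologies(1)))   # [(1, 1, 1)]
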
{{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 896.326308] env[67270]: DEBUG oslo_concurrency.lockutils [req-19b6f823-5f7c-487b-b2e9-86383b8c583b req-0081dc42-8c89-4eda-be03-e362688dcba8 service nova] Acquiring lock "refresh_cache-1000d79b-b491-4071-8ab0-aac90dac6b51" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 896.326454] env[67270]: DEBUG oslo_concurrency.lockutils [req-19b6f823-5f7c-487b-b2e9-86383b8c583b req-0081dc42-8c89-4eda-be03-e362688dcba8 service nova] Acquired lock "refresh_cache-1000d79b-b491-4071-8ab0-aac90dac6b51" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 896.326614] env[67270]: DEBUG nova.network.neutron [req-19b6f823-5f7c-487b-b2e9-86383b8c583b req-0081dc42-8c89-4eda-be03-e362688dcba8 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Refreshing network info cache for port e4c3118a-623f-4306-9869-8309fdebd171 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 896.358114] env[67270]: DEBUG nova.network.neutron [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Successfully created port: 28efe591-fb15-49e8-99fd-767d024c657a {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 896.679030] env[67270]: DEBUG nova.network.neutron [req-19b6f823-5f7c-487b-b2e9-86383b8c583b req-0081dc42-8c89-4eda-be03-e362688dcba8 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Updated VIF entry in instance network info cache for port e4c3118a-623f-4306-9869-8309fdebd171. {{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 896.679249] env[67270]: DEBUG nova.network.neutron [req-19b6f823-5f7c-487b-b2e9-86383b8c583b req-0081dc42-8c89-4eda-be03-e362688dcba8 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Updating instance_info_cache with network_info: [{"id": "e4c3118a-623f-4306-9869-8309fdebd171", "address": "fa:16:3e:e7:6d:05", "network": {"id": "15169a3a-b8fe-4f1c-9ab6-e9a01b6dcfb7", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-617962643-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9e6afd7b877a407e8c006a668e763477", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bfbfc55d-8126-40dd-998e-8600ea92f97c", "external-id": "nsx-vlan-transportzone-650", "segmentation_id": 650, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape4c3118a-62", "ovs_interfaceid": "e4c3118a-623f-4306-9869-8309fdebd171", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 896.692488] env[67270]: DEBUG oslo_concurrency.lockutils 
[req-19b6f823-5f7c-487b-b2e9-86383b8c583b req-0081dc42-8c89-4eda-be03-e362688dcba8 service nova] Releasing lock "refresh_cache-1000d79b-b491-4071-8ab0-aac90dac6b51" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 897.201826] env[67270]: DEBUG nova.network.neutron [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Successfully updated port: 28efe591-fb15-49e8-99fd-767d024c657a {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 897.216919] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Acquiring lock "refresh_cache-69980b41-9514-4d97-aa75-ea68dd05b241" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 897.217076] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Acquired lock "refresh_cache-69980b41-9514-4d97-aa75-ea68dd05b241" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 897.217460] env[67270]: DEBUG nova.network.neutron [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 897.269886] env[67270]: DEBUG nova.network.neutron [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Instance cache missing network info. 
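
"Instance cache missing network info" above means the cache has no entry yet, so Nova rebuilds the network_info list from the instance's Neutron ports. A skeletal version of that translation, keeping only fields visible in this log's instance_info_cache dumps (the field selection is illustrative, not the full model):

    def build_network_info(ports):
        # Reduce Neutron port records to the cached VIF shape; the tap
        # device name is "tap" plus the first 11 chars of the port UUID,
        # e.g. port 28efe591-fb15-... -> tap28efe591-fb.
        return [
            {
                "id": port["id"],
                "address": port["mac_address"],
                "network": {"id": port["network_id"], "bridge": "br-int"},
                "devname": "tap" + port["id"][:11],
                "vnic_type": port.get("binding:vnic_type", "normal"),
                "active": port.get("status") == "ACTIVE",
            }
            for port in ports
        ]
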
{{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 897.695483] env[67270]: DEBUG nova.network.neutron [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Updating instance_info_cache with network_info: [{"id": "28efe591-fb15-49e8-99fd-767d024c657a", "address": "fa:16:3e:e3:5f:89", "network": {"id": "8fd84958-ac82-4407-9b6b-02fe9759b048", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1614034573-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "97c1a11018334dd78bf3cf918d205130", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aef08290-001a-4ae8-aff0-1889e2211389", "external-id": "nsx-vlan-transportzone-389", "segmentation_id": 389, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap28efe591-fb", "ovs_interfaceid": "28efe591-fb15-49e8-99fd-767d024c657a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 897.712252] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Releasing lock "refresh_cache-69980b41-9514-4d97-aa75-ea68dd05b241" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 897.712252] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Instance network_info: |[{"id": "28efe591-fb15-49e8-99fd-767d024c657a", "address": "fa:16:3e:e3:5f:89", "network": {"id": "8fd84958-ac82-4407-9b6b-02fe9759b048", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1614034573-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "97c1a11018334dd78bf3cf918d205130", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aef08290-001a-4ae8-aff0-1889e2211389", "external-id": "nsx-vlan-transportzone-389", "segmentation_id": 389, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap28efe591-fb", "ovs_interfaceid": "28efe591-fb15-49e8-99fd-767d024c657a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 897.712492] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e3:5f:89', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'aef08290-001a-4ae8-aff0-1889e2211389', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '28efe591-fb15-49e8-99fd-767d024c657a', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 897.719826] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Creating folder: Project (97c1a11018334dd78bf3cf918d205130). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 897.720392] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1e01496f-6af9-4c6d-a538-ea302fd96d54 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.731285] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Created folder: Project (97c1a11018334dd78bf3cf918d205130) in parent group-v814248. [ 897.731485] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Creating folder: Instances. Parent ref: group-v814294. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 897.731735] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b9a35af5-76e5-4d70-9b92-b9c97a647694 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.742087] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Created folder: Instances in parent group-v814294. [ 897.742527] env[67270]: DEBUG oslo.service.loopingcall [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 897.742527] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 897.742929] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-36541302-0c2d-4840-944d-7062ebbce4a4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.762941] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 897.762941] env[67270]: value = "task-4110620" [ 897.762941] env[67270]: _type = "Task" [ 897.762941] env[67270]: } to complete. 
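
The "Instance VIF info" dict above is a direct projection of the cached VIF onto the fields the VMware driver needs, with the NSX logical switch exposed as an OpaqueNetwork reference. A sketch of that mapping, shaped after the log's own output:

    def vif_info_from_network_info(network_info):
        # Build VMware VIF-info entries like those logged by
        # build_virtual_machine: the NSX logical-switch id becomes an
        # OpaqueNetwork network_ref.
        return [
            {
                "network_name": vif["network"]["bridge"],     # e.g. br-int
                "mac_address": vif["address"],
                "network_ref": {
                    "type": "OpaqueNetwork",
                    "network-id": vif["details"]["nsx-logical-switch-id"],
                    "network-type": "nsx.LogicalSwitch",
                    "use-external-id": True,
                },
                "iface_id": vif["id"],
                "vif_model": "vmxnet3",
            }
            for vif in network_info
        ]
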
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 897.771861] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110620, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 898.273136] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110620, 'name': CreateVM_Task, 'duration_secs': 0.344836} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 898.273329] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 898.274450] env[67270]: DEBUG oslo_vmware.service [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf9d50a5-a975-497b-86a3-539247f23e5a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.280607] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 898.280809] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 898.281193] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 898.281435] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bfaf2076-d5a2-4244-9900-4813d1f7224b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.285872] env[67270]: DEBUG oslo_vmware.api [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Waiting for the task: (returnval){ [ 898.285872] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]528fcaab-094f-ddb9-1a38-58a31855fe76" [ 898.285872] env[67270]: _type = "Task" [ 898.285872] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 898.293871] env[67270]: DEBUG oslo_vmware.api [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]528fcaab-094f-ddb9-1a38-58a31855fe76, 'name': SearchDatastore_Task} progress is 0%. 
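
The datastore locks and the SearchDatastore_Task that follow implement the image-cache check: hold a per-image lock on the devstack-image-cache_base path, look for the cached VMDK, and only fetch from Glance on a miss. A filesystem-flavored approximation of check-then-fetch (the real code searches the datastore over the vSphere API; these names are illustrative):

    import os
    import threading

    _image_locks = {}    # image id -> lock

    def ensure_cached_image(cache_root, image_id, fetch):
        # Hold a per-image lock across the check so two concurrent builds
        # of the same image do not both download it.
        lock = _image_locks.setdefault(image_id, threading.Lock())
        cached = os.path.join(cache_root, image_id, image_id + ".vmdk")
        with lock:
            if not os.path.exists(cached):
                fetch(image_id, cached)   # Glance download on cache miss
            return cached
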
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 898.388626] env[67270]: DEBUG nova.compute.manager [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Received event network-vif-plugged-28efe591-fb15-49e8-99fd-767d024c657a {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 898.388895] env[67270]: DEBUG oslo_concurrency.lockutils [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] Acquiring lock "69980b41-9514-4d97-aa75-ea68dd05b241-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 898.389152] env[67270]: DEBUG oslo_concurrency.lockutils [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] Lock "69980b41-9514-4d97-aa75-ea68dd05b241-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 898.389423] env[67270]: DEBUG oslo_concurrency.lockutils [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] Lock "69980b41-9514-4d97-aa75-ea68dd05b241-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.389495] env[67270]: DEBUG nova.compute.manager [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] No waiting events found dispatching network-vif-plugged-28efe591-fb15-49e8-99fd-767d024c657a {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 898.389657] env[67270]: WARNING nova.compute.manager [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Received unexpected event network-vif-plugged-28efe591-fb15-49e8-99fd-767d024c657a for instance with vm_state building and task_state spawning. [ 898.389813] env[67270]: DEBUG nova.compute.manager [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Received event network-changed-28efe591-fb15-49e8-99fd-767d024c657a {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 898.389991] env[67270]: DEBUG nova.compute.manager [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Refreshing instance network info cache due to event network-changed-28efe591-fb15-49e8-99fd-767d024c657a.
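
The "No waiting events found ... Received unexpected event" pair is the external-event waiter protocol: an event is only dispatched to a waiter that was registered in advance (e.g. by code blocking on network-vif-plugged); with no waiter it is logged and dropped. A distilled pop-or-warn sketch, using threading.Event where Nova uses eventlet primitives:

    import threading

    _waiters = {}    # (instance UUID, event name) -> threading.Event

    def pop_instance_event(instance_uuid, event_name):
        # Wake the registered waiter if there is one; otherwise report
        # the event as unexpected, like the WARNING above.
        waiter = _waiters.pop((instance_uuid, event_name), None)
        if waiter is None:
            print(f"WARNING: Received unexpected event {event_name} "
                  f"for instance {instance_uuid}")
            return False
        waiter.set()
        return True
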
{{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 898.390189] env[67270]: DEBUG oslo_concurrency.lockutils [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] Acquiring lock "refresh_cache-69980b41-9514-4d97-aa75-ea68dd05b241" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 898.390331] env[67270]: DEBUG oslo_concurrency.lockutils [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] Acquired lock "refresh_cache-69980b41-9514-4d97-aa75-ea68dd05b241" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 898.390471] env[67270]: DEBUG nova.network.neutron [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Refreshing network info cache for port 28efe591-fb15-49e8-99fd-767d024c657a {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 898.643125] env[67270]: DEBUG nova.network.neutron [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Updated VIF entry in instance network info cache for port 28efe591-fb15-49e8-99fd-767d024c657a. {{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 898.643525] env[67270]: DEBUG nova.network.neutron [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Updating instance_info_cache with network_info: [{"id": "28efe591-fb15-49e8-99fd-767d024c657a", "address": "fa:16:3e:e3:5f:89", "network": {"id": "8fd84958-ac82-4407-9b6b-02fe9759b048", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1614034573-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "97c1a11018334dd78bf3cf918d205130", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aef08290-001a-4ae8-aff0-1889e2211389", "external-id": "nsx-vlan-transportzone-389", "segmentation_id": 389, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap28efe591-fb", "ovs_interfaceid": "28efe591-fb15-49e8-99fd-767d024c657a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 898.652371] env[67270]: DEBUG oslo_concurrency.lockutils [req-1f76413c-c8ef-4492-9909-0943cca222b8 req-141c6ad2-f683-4065-aa74-71572768e37a service nova] Releasing lock "refresh_cache-69980b41-9514-4d97-aa75-ea68dd05b241" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 898.796249] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Releasing lock "[datastore2] 
devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 898.796572] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 898.796761] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 898.796905] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 898.797107] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 898.797501] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c1e0de30-1987-4728-900f-9a8823bca67e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.815418] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 898.815614] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 898.816452] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6eb153f-15f1-48c0-b8b8-6b6c2e38108c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.823394] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-98b85ab7-f177-48ed-9aaf-6374e8467d09 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.830223] env[67270]: DEBUG oslo_vmware.api [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Waiting for the task: (returnval){ [ 898.830223] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]529643b8-b6fd-b5a0-ac33-4ffa4df14ce8" [ 898.830223] env[67270]: _type = "Task" [ 898.830223] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 898.839129] env[67270]: DEBUG oslo_vmware.api [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]529643b8-b6fd-b5a0-ac33-4ffa4df14ce8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 898.872427] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquiring lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.343501] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 899.343760] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Creating directory with path [datastore2] vmware_temp/0c19fe5f-24d2-4576-8ff1-e0889fcb630e/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 899.343991] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6f00a3f4-2332-459a-9c2c-2ace9171fe50 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.367763] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Created directory with path [datastore2] vmware_temp/0c19fe5f-24d2-4576-8ff1-e0889fcb630e/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 899.367985] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None
req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Fetch image to [datastore2] vmware_temp/0c19fe5f-24d2-4576-8ff1-e0889fcb630e/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 899.368172] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore2] vmware_temp/0c19fe5f-24d2-4576-8ff1-e0889fcb630e/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore2 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 899.369038] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84211340-a6a6-4a6b-bbaa-7f41c2f3f231 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.376751] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa3c2f85-a443-4017-a4ed-a0502c628238 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.387030] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c671f90-daf3-48a8-b85f-78645f7b2ffb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.420845] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-825d08c6-50c2-4962-a405-4343079b8d92 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.428188] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-56a79eae-544c-4cd0-a1ce-a25fa541d456 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.452239] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore2 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 899.466792] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquiring lock "8b43a9a6-b28c-43ed-9f83-02424f73dc3c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.504468] env[67270]: DEBUG oslo_vmware.rw_handles [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Creating HTTP connection to write to file with size = 21318656 and URL =
https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0c19fe5f-24d2-4576-8ff1-e0889fcb630e/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 899.562076] env[67270]: DEBUG oslo_vmware.rw_handles [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 899.562355] env[67270]: DEBUG oslo_vmware.rw_handles [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0c19fe5f-24d2-4576-8ff1-e0889fcb630e/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 913.836457] env[67270]: INFO nova.compute.manager [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Rebuilding instance [ 913.869347] env[67270]: DEBUG nova.objects.instance [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lazy-loading 'trusted_certs' on Instance uuid 1000d79b-b491-4071-8ab0-aac90dac6b51 {{(pid=67270) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 913.881884] env[67270]: DEBUG nova.compute.manager [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Checking state {{(pid=67270) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 913.882760] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49df4f52-2776-473b-bf89-90688295ae97 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.922811] env[67270]: DEBUG nova.objects.instance [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lazy-loading 'pci_requests' on Instance uuid 1000d79b-b491-4071-8ab0-aac90dac6b51 {{(pid=67270) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 913.931724] env[67270]: DEBUG nova.objects.instance [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lazy-loading 'pci_devices' on Instance uuid 1000d79b-b491-4071-8ab0-aac90dac6b51 {{(pid=67270) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 913.942423] env[67270]: DEBUG nova.objects.instance [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lazy-loading 'resources' on Instance uuid 1000d79b-b491-4071-8ab0-aac90dac6b51 {{(pid=67270) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 913.949634] env[67270]: DEBUG 
nova.objects.instance [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lazy-loading 'migration_context' on Instance uuid 1000d79b-b491-4071-8ab0-aac90dac6b51 {{(pid=67270) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 913.956557] env[67270]: DEBUG nova.objects.instance [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Trying to apply a migration context that does not seem to be set for this instance {{(pid=67270) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1032}} [ 913.956989] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Powering off the VM {{(pid=67270) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 913.957688] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-7cb3c641-e484-4077-aafe-d59225db8064 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.966863] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for the task: (returnval){ [ 913.966863] env[67270]: value = "task-4110621" [ 913.966863] env[67270]: _type = "Task" [ 913.966863] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 913.976602] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110621, 'name': PowerOffVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 914.477908] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110621, 'name': PowerOffVM_Task, 'duration_secs': 0.190486} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 914.478222] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Powered off the VM {{(pid=67270) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 914.478916] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Powering off the VM {{(pid=67270) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 914.479170] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-c21719a8-5d1a-47b7-8ed7-92f815900c94 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 914.485649] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for the task: (returnval){ [ 914.485649] env[67270]: value = "task-4110622" [ 914.485649] env[67270]: _type = "Task" [ 914.485649] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 914.493493] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110622, 'name': PowerOffVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 914.997052] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] VM already powered off {{(pid=67270) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 914.997052] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Volume detach. 
Driver type: vmdk {{(pid=67270) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 914.997052] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-814285', 'volume_id': 'acb318dc-85b5-4555-a128-55b3932ac7fc', 'name': 'volume-acb318dc-85b5-4555-a128-55b3932ac7fc', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1000d79b-b491-4071-8ab0-aac90dac6b51', 'attached_at': '', 'detached_at': '', 'volume_id': 'acb318dc-85b5-4555-a128-55b3932ac7fc', 'serial': 'acb318dc-85b5-4555-a128-55b3932ac7fc'} {{(pid=67270) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 914.997595] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f49ce46-2ee3-4997-8f0d-237f4eb4f9d7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.018175] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c32f279-0d6c-4f4c-81a7-da6e5542b912 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.025651] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-138bdc58-c80c-4390-a833-5d3afbe40453 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.043379] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3b2791f-7e99-4f73-be8e-46f48eff7cfd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.060447] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] The volume has not been displaced from its original location: [datastore1] volume-acb318dc-85b5-4555-a128-55b3932ac7fc/volume-acb318dc-85b5-4555-a128-55b3932ac7fc.vmdk. No consolidation needed. {{(pid=67270) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 915.067117] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Reconfiguring VM instance instance-0000000e to detach disk 2000 {{(pid=67270) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 915.067117] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-3d500c4a-8012-4a43-9370-8b26bc821ff7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.087061] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for the task: (returnval){ [ 915.087061] env[67270]: value = "task-4110623" [ 915.087061] env[67270]: _type = "Task" [ 915.087061] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 915.096017] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110623, 'name': ReconfigVM_Task} progress is 6%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 915.599085] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110623, 'name': ReconfigVM_Task, 'duration_secs': 0.182469} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 915.599085] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Reconfigured VM instance instance-0000000e to detach disk 2000 {{(pid=67270) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 915.602572] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b13e176a-219c-4702-bd4d-cf8219989c35 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 915.619057] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for the task: (returnval){ [ 915.619057] env[67270]: value = "task-4110624" [ 915.619057] env[67270]: _type = "Task" [ 915.619057] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 915.629616] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110624, 'name': ReconfigVM_Task} progress is 6%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 916.129961] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110624, 'name': ReconfigVM_Task, 'duration_secs': 0.116015} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 916.130275] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-814285', 'volume_id': 'acb318dc-85b5-4555-a128-55b3932ac7fc', 'name': 'volume-acb318dc-85b5-4555-a128-55b3932ac7fc', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '1000d79b-b491-4071-8ab0-aac90dac6b51', 'attached_at': '', 'detached_at': '', 'volume_id': 'acb318dc-85b5-4555-a128-55b3932ac7fc', 'serial': 'acb318dc-85b5-4555-a128-55b3932ac7fc'} {{(pid=67270) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 916.130544] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 916.131337] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8a7dc58-3da5-4685-9c73-8a4f088ac11c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 916.137987] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 916.138218] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4a3597ec-1efb-4af5-bb88-6cf9fb583316 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 916.198388] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 916.198639] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 916.198826] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Deleting the datastore file [datastore1] 1000d79b-b491-4071-8ab0-aac90dac6b51 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 916.199147] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-330c73f5-c2d1-43f4-af56-934b1f2fad0f {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 916.205821] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for the task: (returnval){ [ 916.205821] env[67270]: value = "task-4110626" [ 916.205821] env[67270]: _type = "Task" [ 916.205821] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 916.213875] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110626, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 916.719031] env[67270]: DEBUG oslo_vmware.api [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Task: {'id': task-4110626, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077069} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 916.719325] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 916.719534] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 916.719713] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 916.771841] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Volume detach. 
Driver type: vmdk {{(pid=67270) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 916.773030] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2cfecbf3-bce1-4211-8fc4-01295e92bee6 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 916.781165] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab310d15-6a06-461d-9910-897767505c7e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 916.811387] env[67270]: ERROR nova.compute.manager [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Failed to detach volume acb318dc-85b5-4555-a128-55b3932ac7fc from /dev/sda: nova.exception.InstanceNotFound: Instance 1000d79b-b491-4071-8ab0-aac90dac6b51 could not be found. [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Traceback (most recent call last): [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self.driver.rebuild(**kwargs) [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] raise NotImplementedError() [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] NotImplementedError [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] During handling of the above exception, another exception occurred: [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Traceback (most recent call last): [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 916.811387] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self.driver.detach_volume(context, old_connection_info, [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] return self._volumeops.detach_volume(connection_info, instance) [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self._detach_volume_vmdk(connection_info, instance) [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 
1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] stable_ref.fetch_moref(session) [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] raise exception.InstanceNotFound(instance_id=self._uuid) [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] nova.exception.InstanceNotFound: Instance 1000d79b-b491-4071-8ab0-aac90dac6b51 could not be found. [ 916.811849] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] [ 916.943224] env[67270]: DEBUG nova.compute.utils [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Build of instance 1000d79b-b491-4071-8ab0-aac90dac6b51 aborted: Failed to rebuild volume backed instance. {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 916.945875] env[67270]: ERROR nova.compute.manager [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance 1000d79b-b491-4071-8ab0-aac90dac6b51 aborted: Failed to rebuild volume backed instance. 
[ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Traceback (most recent call last): [ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self.driver.rebuild(**kwargs) [ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] raise NotImplementedError() [ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] NotImplementedError [ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] [ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] During handling of the above exception, another exception occurred: [ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] [ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Traceback (most recent call last): [ 916.945875] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 3570, in _rebuild_volume_backed_instance [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self._detach_root_volume(context, instance, root_bdm) [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 3549, in _detach_root_volume [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] with excutils.save_and_reraise_exception(): [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self.force_reraise() [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] raise self.value [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self.driver.detach_volume(context, old_connection_info, [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] return self._volumeops.detach_volume(connection_info, instance) [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", 
line 649, in detach_volume [ 916.946368] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self._detach_volume_vmdk(connection_info, instance) [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] stable_ref.fetch_moref(session) [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] raise exception.InstanceNotFound(instance_id=self._uuid) [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] nova.exception.InstanceNotFound: Instance 1000d79b-b491-4071-8ab0-aac90dac6b51 could not be found. [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] During handling of the above exception, another exception occurred: [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Traceback (most recent call last): [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 10738, in _error_out_instance_on_exception [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] yield [ 916.946873] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 3826, in rebuild_instance [ 916.947303] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self._do_rebuild_instance_with_claim( [ 916.947303] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 3912, in _do_rebuild_instance_with_claim [ 916.947303] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self._do_rebuild_instance( [ 916.947303] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 4104, in _do_rebuild_instance [ 916.947303] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self._rebuild_default_impl(**kwargs) [ 916.947303] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 3693, in _rebuild_default_impl [ 916.947303] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] self._rebuild_volume_backed_instance( [ 916.947303] env[67270]: ERROR nova.compute.manager 
[instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] File "/opt/stack/nova/nova/compute/manager.py", line 3585, in _rebuild_volume_backed_instance [ 916.947303] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] raise exception.BuildAbortException( [ 916.947303] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] nova.exception.BuildAbortException: Build of instance 1000d79b-b491-4071-8ab0-aac90dac6b51 aborted: Failed to rebuild volume backed instance. [ 916.947303] env[67270]: ERROR nova.compute.manager [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] [ 917.049347] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 917.049629] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 917.376491] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9e22924-ac4b-4fd5-9b04-d1bf3719f9d3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 917.384270] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19c2c46a-7653-43cf-a094-87f6a91634fd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 917.414351] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e9b56dc-fccd-444e-862b-d9d372ee3579 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 917.422338] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9abc8b66-7115-449b-a25e-253cc1be9e37 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 917.435967] env[67270]: DEBUG nova.compute.provider_tree [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 917.444335] env[67270]: DEBUG nova.scheduler.client.report [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 917.462773] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.413s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 917.462972] env[67270]: INFO nova.compute.manager [None req-c93c9ed9-e577-47c6-b8d5-dfe6ece3916d tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Successfully reverted task state from rebuilding on failure for instance. [ 917.940024] env[67270]: DEBUG oslo_concurrency.lockutils [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Acquiring lock "1000d79b-b491-4071-8ab0-aac90dac6b51" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 917.940024] env[67270]: DEBUG oslo_concurrency.lockutils [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "1000d79b-b491-4071-8ab0-aac90dac6b51" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 917.940305] env[67270]: DEBUG oslo_concurrency.lockutils [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Acquiring lock "1000d79b-b491-4071-8ab0-aac90dac6b51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 917.940464] env[67270]: DEBUG oslo_concurrency.lockutils [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "1000d79b-b491-4071-8ab0-aac90dac6b51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 917.940655] env[67270]: DEBUG oslo_concurrency.lockutils [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "1000d79b-b491-4071-8ab0-aac90dac6b51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 917.942810] env[67270]: INFO nova.compute.manager [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Terminating instance [ 917.944949] env[67270]: DEBUG nova.compute.manager [None req-4819a39e-63b8-453a-9093-73c0f5df17d6
tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 917.945420] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7e6565b5-bf40-4cad-bab5-fc8a153f8e89 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 917.955664] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-653bdb3a-c1fa-4b4f-9d52-c0776b3bdb16 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 917.986995] env[67270]: WARNING nova.virt.vmwareapi.driver [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance 1000d79b-b491-4071-8ab0-aac90dac6b51 could not be found. [ 917.987271] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 917.987640] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f0121046-9854-4161-87be-c8d88e2474b5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 917.996646] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9e2f266-6ea0-47ce-ae9c-6d20a124aee1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 918.028324] env[67270]: WARNING nova.virt.vmwareapi.vmops [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1000d79b-b491-4071-8ab0-aac90dac6b51 could not be found. [ 918.028525] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 918.028770] env[67270]: INFO nova.compute.manager [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Took 0.08 seconds to destroy the instance on the hypervisor. [ 918.029135] env[67270]: DEBUG oslo.service.loopingcall [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 918.029385] env[67270]: DEBUG nova.compute.manager [-] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 918.029520] env[67270]: DEBUG nova.network.neutron [-] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 918.692281] env[67270]: DEBUG nova.network.neutron [-] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 918.708953] env[67270]: INFO nova.compute.manager [-] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Took 0.68 seconds to deallocate network for instance. [ 918.758979] env[67270]: DEBUG nova.compute.manager [req-8f5658a6-af9f-43bd-ad9e-693409edc3c5 req-97c1a95b-c911-4d37-9f15-b8c361dadb50 service nova] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Received event network-vif-deleted-e4c3118a-623f-4306-9869-8309fdebd171 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 918.780449] env[67270]: INFO nova.compute.manager [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Took 0.07 seconds to detach 1 volumes for instance. [ 918.784998] env[67270]: DEBUG nova.compute.manager [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] [instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Deleting volume: acb318dc-85b5-4555-a128-55b3932ac7fc {{(pid=67270) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3217}} [ 918.890860] env[67270]: DEBUG oslo_concurrency.lockutils [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 918.891164] env[67270]: DEBUG oslo_concurrency.lockutils [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 918.891388] env[67270]: DEBUG nova.objects.instance [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lazy-loading 'resources' on Instance uuid 1000d79b-b491-4071-8ab0-aac90dac6b51 {{(pid=67270) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 919.353832] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce4fd971-8483-4ae5-b18c-0b4368c1b2ea {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.362314] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f92bbbcc-4249-4919-a3af-75cb0da20e53 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.394371] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07784855-a530-4807-9d27-aefe8af6016f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.402716] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0e344e8-e01e-4e69-bc89-bd9e0c767371 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 919.417458] env[67270]: DEBUG nova.compute.provider_tree [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 919.427838] env[67270]: DEBUG nova.scheduler.client.report [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 919.447472] env[67270]: DEBUG oslo_concurrency.lockutils [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.556s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 919.519181] env[67270]: DEBUG oslo_concurrency.lockutils [None req-4819a39e-63b8-453a-9093-73c0f5df17d6 tempest-ServerActionsV293TestJSON-1008729719 tempest-ServerActionsV293TestJSON-1008729719-project-member] Lock "1000d79b-b491-4071-8ab0-aac90dac6b51" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.579s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 929.761425] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 929.761425] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Cleaning up deleted instances {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 929.779080] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] There are 2 instances to clean {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 929.779436] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] 
[instance: 1000d79b-b491-4071-8ab0-aac90dac6b51] Instance has had 0 of 5 cleanup attempts {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 929.819761] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Instance has had 0 of 5 cleanup attempts {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 929.856314] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 929.856314] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Cleaning up deleted instances with incomplete migration {{(pid=67270) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 929.864227] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 930.868279] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 930.868609] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 930.868647] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 930.888497] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 930.888712] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 930.888778] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 930.888891] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 930.889029] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Skipping network cache update for instance because it is Building. 
{{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 930.889143] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 930.889268] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 930.889417] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 930.889574] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 930.889702] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 931.757578] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 932.757517] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 932.757761] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 932.757905] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 933.752913] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 933.757825] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 933.759859] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 933.759859] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 933.759859] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 933.770815] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.771260] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.771331] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 933.771565] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 933.773323] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-631556bf-f239-4076-963f-a78d543dc0f4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.782487] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57c15e20-cf8c-4dde-9617-26937148a3db {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.798765] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe06f74d-938d-47a6-9ce9-a322c407907c {{(pid=67270) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.807479] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60429596-6545-4af9-b25c-2a7c95c56de7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.838402] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180816MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 933.838581] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.838764] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.905111] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4a086288-b773-40aa-b39a-e3f3b9784a05 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.905111] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance c847f4cb-1914-497b-8d63-5b99a237e5e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.905111] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 379f5a6d-d6d4-434a-b401-1b027434e6fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.905320] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a073c7a9-d7ee-4d9e-be23-4345ed5f9047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.905320] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 5d61c322-6a7d-4991-8cc4-6dcb1be74256 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.905431] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.905550] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8ddc70e6-ec6f-4740-8109-6ba2c5d00536 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.905665] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2de499d5-2eb3-4138-8c6b-41fb94ff27eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.905778] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 69980b41-9514-4d97-aa75-ea68dd05b241 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.918600] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 49292f00-1457-438b-b5b7-2ac35dd464d2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 933.933897] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 87ef9733-e8d6-429e-b23f-8b8aadef784c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 933.945742] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2f050e13-5621-4dda-ade1-cfbef017e57e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 933.962943] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4a1a791f-36f3-48af-9792-4a9eaeba26c9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 933.974702] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 907dfc72-e766-4a24-a4e7-df762db37824 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 933.986911] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance cbe3ecc4-3c5b-4749-a21c-c0376583c4aa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.000108] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance f42f9cc0-c33a-4bdc-b16c-8dec61896b27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.014543] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4e53a7b7-7194-4ceb-abef-5d0779effbfb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.027864] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2a6c8de3-8974-4533-a474-c4242fd735c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.040020] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4c9dbddd-4c74-4ee0-a1be-e7a5c7cfc344 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.050545] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance ee08ac0e-d7fb-4f36-962b-cb8b88bf6bb5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.065175] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a9aaa31c-5228-4210-b3c0-ca8c5a8c6213 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.077123] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 4dce8f09-ce7e-419c-90b4-48ee54d8c604 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.090665] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance c372287f-35e3-402a-9841-6f55ea471d3d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.105921] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 2de2d5d9-2644-408a-8957-2c169b2793ce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.118225] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 65422c06-b1cf-4868-8f38-391b08038fc9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.130121] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance e976fd9e-95a3-4564-9bd6-08ee3f15a188 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.143884] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 662bb470-e6ed-4a37-bb23-74a0a36dff0c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.155201] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 25cc189a-383b-450c-810d-85ea2b48fdca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 934.155506] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 934.155695] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1728MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 934.231383] env[67270]: WARNING oslo_vmware.rw_handles [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 934.231383] env[67270]: ERROR oslo_vmware.rw_handles [ 934.231900] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/fd581990-d817-4665-802e-5d71078833ee/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 934.234124] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 
tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 934.234486] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Copying Virtual Disk [datastore1] vmware_temp/fd581990-d817-4665-802e-5d71078833ee/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/fd581990-d817-4665-802e-5d71078833ee/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 934.234994] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-37c75534-f87c-4a6b-8c2c-fcfcd7deb49f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.244481] env[67270]: DEBUG oslo_vmware.api [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Waiting for the task: (returnval){ [ 934.244481] env[67270]: value = "task-4110628" [ 934.244481] env[67270]: _type = "Task" [ 934.244481] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 934.254797] env[67270]: DEBUG oslo_vmware.api [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Task: {'id': task-4110628, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 934.269056] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Refreshing inventories for resource provider ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 934.286880] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Updating ProviderTree inventory for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 934.286880] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Updating inventory in ProviderTree for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 934.299252] env[67270]: DEBUG nova.scheduler.client.report [None 
req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Refreshing aggregate associations for resource provider ddbaf518-603f-4953-8d5d-25c9ed7292bd, aggregates: None {{(pid=67270) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 934.320058] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Refreshing trait associations for resource provider ddbaf518-603f-4953-8d5d-25c9ed7292bd, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67270) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 934.703131] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7540805-31ae-442d-ba4c-f70b1076e436 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.711018] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-682448ef-984e-49ac-be3b-e4ce58ca6463 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.740722] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2adc4f12-10d3-4067-84d0-ab540b2c4d34 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.751982] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cce8622-5e75-4ed4-827e-8d8aa3ebcb0e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.761168] env[67270]: DEBUG oslo_vmware.exceptions [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Fault InvalidArgument not matched. 
{{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 934.768791] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 934.769384] env[67270]: ERROR nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 934.769384] env[67270]: Faults: ['InvalidArgument'] [ 934.769384] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Traceback (most recent call last): [ 934.769384] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 934.769384] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] yield resources [ 934.769384] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 934.769384] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] self.driver.spawn(context, instance, image_meta, [ 934.769384] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 934.769384] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] self._vmops.spawn(context, instance, image_meta, injected_files, [ 934.769384] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 934.769384] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] self._fetch_image_if_missing(context, vi) [ 934.769384] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] image_cache(vi, tmp_image_ds_loc) [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] vm_util.copy_virtual_disk( [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] session._wait_for_task(vmdk_copy_task) [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] return self.wait_for_task(task_ref) [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] return evt.wait() [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] result = hub.switch() [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 934.769818] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] return self.greenlet.switch() [ 934.770250] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 934.770250] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] self.f(*self.args, **self.kw) [ 934.770250] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 934.770250] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] raise exceptions.translate_fault(task_info.error) [ 934.770250] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 934.770250] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Faults: ['InvalidArgument'] [ 934.770250] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] [ 934.770250] env[67270]: INFO nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Terminating instance [ 934.771426] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 934.772831] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Start destroying the instance on the hypervisor. 
{{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 934.773046] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 934.773305] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 934.773494] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 934.774206] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-938bae94-3376-457f-aa0b-953697f141f5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.776922] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-820646f5-f486-43d2-a479-9935cdd1a21b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.780632] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 934.785824] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 934.786977] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dd995f43-f6cc-4edd-a5db-2b5404f5e0d0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.788501] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 934.788672] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 
tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 934.789542] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6281f9dd-c105-46e7-a0de-5618e45e1318 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.795250] env[67270]: DEBUG oslo_vmware.api [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Waiting for the task: (returnval){ [ 934.795250] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]5238ae27-8971-c00b-2d5d-269218777c3f" [ 934.795250] env[67270]: _type = "Task" [ 934.795250] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 934.795789] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 934.795959] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.957s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 934.805669] env[67270]: DEBUG oslo_vmware.api [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]5238ae27-8971-c00b-2d5d-269218777c3f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 934.874811] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 934.874811] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 934.875051] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Deleting the datastore file [datastore1] 4a086288-b773-40aa-b39a-e3f3b9784a05 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 934.875230] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-52b492d1-5aed-410c-a2d8-0c0d7e8b4ab4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.882080] env[67270]: DEBUG oslo_vmware.api [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Waiting for the task: (returnval){ [ 934.882080] env[67270]: value = "task-4110630" [ 934.882080] env[67270]: _type = "Task" [ 934.882080] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 934.890168] env[67270]: DEBUG oslo_vmware.api [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Task: {'id': task-4110630, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 935.307499] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 935.307953] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Creating directory with path [datastore1] vmware_temp/743a8a9c-0128-4b9a-98fe-8458ac988edc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 935.308303] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c05d74c3-cdb6-4758-973c-3fa3f513b7e7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 935.321649] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Created directory with path [datastore1] vmware_temp/743a8a9c-0128-4b9a-98fe-8458ac988edc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 935.321848] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Fetch image to [datastore1] vmware_temp/743a8a9c-0128-4b9a-98fe-8458ac988edc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 935.322122] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/743a8a9c-0128-4b9a-98fe-8458ac988edc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 935.322926] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ecb3185-3d3e-462d-9cee-1a421abbc1e0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 935.330485] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe1ba446-6293-49b5-98fb-df6a2f7dd714 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 935.340560] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69417621-0c9d-4889-8f17-8ac8cbd502f0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 935.373924] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ff12f8bd-c08d-4e8a-8b04-2c9d9d2fbe3a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 935.381096] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-53011559-6908-4727-beda-987fdd219e52 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 935.392060] env[67270]: DEBUG oslo_vmware.api [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Task: {'id': task-4110630, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.083331} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 935.392369] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 935.392607] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 935.392803] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 935.393037] env[67270]: INFO nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 935.395246] env[67270]: DEBUG nova.compute.claims [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 935.395403] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 935.395621] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 935.415198] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 935.466499] env[67270]: DEBUG oslo_vmware.rw_handles [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/743a8a9c-0128-4b9a-98fe-8458ac988edc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 935.527417] env[67270]: DEBUG oslo_vmware.rw_handles [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 935.527622] env[67270]: DEBUG oslo_vmware.rw_handles [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/743a8a9c-0128-4b9a-98fe-8458ac988edc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 935.832232] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad43f8cb-615f-4877-b4cf-6c1eb7115cdd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 935.840155] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62c16402-4d85-4887-88d2-aa909b17bce5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 935.870507] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ef12f35-b4c0-4bc4-80f6-820b98b7ad12 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 935.878222] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21924a27-067e-46b1-9e23-43513c2eadb4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 935.891758] env[67270]: DEBUG nova.compute.provider_tree [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 935.900468] env[67270]: DEBUG nova.scheduler.client.report [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 935.921056] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.525s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 935.921407] env[67270]: ERROR nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 935.921407] env[67270]: Faults: ['InvalidArgument'] [ 935.921407] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Traceback (most recent call last): [ 935.921407] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 935.921407] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] 
self.driver.spawn(context, instance, image_meta, [ 935.921407] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 935.921407] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] self._vmops.spawn(context, instance, image_meta, injected_files, [ 935.921407] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 935.921407] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] self._fetch_image_if_missing(context, vi) [ 935.921407] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 935.921407] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] image_cache(vi, tmp_image_ds_loc) [ 935.921407] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] vm_util.copy_virtual_disk( [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] session._wait_for_task(vmdk_copy_task) [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] return self.wait_for_task(task_ref) [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] return evt.wait() [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] result = hub.switch() [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] return self.greenlet.switch() [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 935.921779] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] self.f(*self.args, **self.kw) [ 935.922164] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 935.922164] env[67270]: ERROR nova.compute.manager [instance: 
4a086288-b773-40aa-b39a-e3f3b9784a05] raise exceptions.translate_fault(task_info.error) [ 935.922164] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 935.922164] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Faults: ['InvalidArgument'] [ 935.922164] env[67270]: ERROR nova.compute.manager [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] [ 935.922164] env[67270]: DEBUG nova.compute.utils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 935.923778] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Build of instance 4a086288-b773-40aa-b39a-e3f3b9784a05 was re-scheduled: A specified parameter was not correct: fileType [ 935.923778] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 935.924178] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 935.924354] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 935.924508] env[67270]: DEBUG nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 935.924700] env[67270]: DEBUG nova.network.neutron [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 936.487584] env[67270]: DEBUG nova.network.neutron [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 936.504749] env[67270]: INFO nova.compute.manager [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Took 0.58 seconds to deallocate network for instance. 
[ 936.598423] env[67270]: INFO nova.scheduler.client.report [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Deleted allocations for instance 4a086288-b773-40aa-b39a-e3f3b9784a05 [ 936.625929] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cf89ff06-a6a9-4dac-b845-c0ddae507acc tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Lock "4a086288-b773-40aa-b39a-e3f3b9784a05" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 254.416s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 936.626652] env[67270]: DEBUG oslo_concurrency.lockutils [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Lock "4a086288-b773-40aa-b39a-e3f3b9784a05" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 56.417s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 936.626759] env[67270]: DEBUG oslo_concurrency.lockutils [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Acquiring lock "4a086288-b773-40aa-b39a-e3f3b9784a05-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 936.627059] env[67270]: DEBUG oslo_concurrency.lockutils [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Lock "4a086288-b773-40aa-b39a-e3f3b9784a05-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 936.627338] env[67270]: DEBUG oslo_concurrency.lockutils [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Lock "4a086288-b773-40aa-b39a-e3f3b9784a05-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 936.629613] env[67270]: INFO nova.compute.manager [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Terminating instance [ 936.631375] env[67270]: DEBUG nova.compute.manager [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Start destroying the instance on the hypervisor. 
{{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 936.631577] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 936.632879] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e058cd32-185f-4f72-abef-c2621bd7440c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 936.642223] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18e4caa1-f2cb-48f8-ac3a-8d2bf83b9f71 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 936.654469] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 936.675855] env[67270]: WARNING nova.virt.vmwareapi.vmops [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4a086288-b773-40aa-b39a-e3f3b9784a05 could not be found. [ 936.675914] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 936.676091] env[67270]: INFO nova.compute.manager [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Took 0.04 seconds to destroy the instance on the hypervisor. [ 936.676374] env[67270]: DEBUG oslo.service.loopingcall [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 936.676636] env[67270]: DEBUG nova.compute.manager [-] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 936.676734] env[67270]: DEBUG nova.network.neutron [-] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 936.708043] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 936.708348] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 936.709874] env[67270]: INFO nova.compute.claims [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 936.713109] env[67270]: DEBUG nova.network.neutron [-] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 936.720927] env[67270]: INFO nova.compute.manager [-] [instance: 4a086288-b773-40aa-b39a-e3f3b9784a05] Took 0.04 seconds to deallocate network for instance. 
[ 936.849854] env[67270]: DEBUG oslo_concurrency.lockutils [None req-99dc98b8-10cd-4f91-a43a-2a885c509755 tempest-ServerDiagnosticsTest-1963084671 tempest-ServerDiagnosticsTest-1963084671-project-member] Lock "4a086288-b773-40aa-b39a-e3f3b9784a05" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.224s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 937.134475] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa3471c9-b116-4ac1-9906-2dbf871d199d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 937.143615] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c84f88f7-aa91-424f-9449-e3b68a8c00d1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 937.178253] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f2e67a5-5257-412f-9635-5358a164428f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 937.184193] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21a11a13-fbea-4f82-8a06-b63fa3159102 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 937.199115] env[67270]: DEBUG nova.compute.provider_tree [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 937.209138] env[67270]: DEBUG nova.scheduler.client.report [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 937.223615] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.515s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 937.224160] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Start building networks asynchronously for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 937.266473] env[67270]: DEBUG nova.compute.utils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 937.268051] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 937.268051] env[67270]: DEBUG nova.network.neutron [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 937.279292] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 937.354543] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Start spawning the instance on the hypervisor. 
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 937.368076] env[67270]: DEBUG nova.policy [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11faecb4309b4924be49523d55557e62', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5ece736de8ef4e4580cf02fcc0e07a86', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 937.382560] env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 937.382793] env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 937.382948] env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 937.383141] env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 937.383289] env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 937.383432] env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 937.383637] 
env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 937.383796] env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 937.383961] env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 937.388248] env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 937.388475] env[67270]: DEBUG nova.virt.hardware [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 937.389424] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af9cf6c2-509e-4ddb-b5ce-0e15c0c4cbe1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 937.398734] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3fa8fd1-caeb-451f-95cb-fc9d5d09dd3b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 937.755116] env[67270]: DEBUG nova.network.neutron [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Successfully created port: d9a92fea-305d-4d67-b157-6b8348821277 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 938.551166] env[67270]: DEBUG nova.network.neutron [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Successfully updated port: d9a92fea-305d-4d67-b157-6b8348821277 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 938.563851] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Acquiring lock "refresh_cache-49292f00-1457-438b-b5b7-2ac35dd464d2" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 938.563997] env[67270]: DEBUG oslo_concurrency.lockutils [None 
req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Acquired lock "refresh_cache-49292f00-1457-438b-b5b7-2ac35dd464d2" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 938.564166] env[67270]: DEBUG nova.network.neutron [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 938.612866] env[67270]: DEBUG nova.network.neutron [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 938.861554] env[67270]: DEBUG nova.network.neutron [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Updating instance_info_cache with network_info: [{"id": "d9a92fea-305d-4d67-b157-6b8348821277", "address": "fa:16:3e:38:b5:36", "network": {"id": "12618a1d-34e5-4061-9a87-bb76ac99721d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-2061483586-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5ece736de8ef4e4580cf02fcc0e07a86", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4307c18-b235-43cd-bcd5-e226012d8ee9", "external-id": "nsx-vlan-transportzone-867", "segmentation_id": 867, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd9a92fea-30", "ovs_interfaceid": "d9a92fea-305d-4d67-b157-6b8348821277", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 938.873604] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Releasing lock "refresh_cache-49292f00-1457-438b-b5b7-2ac35dd464d2" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 938.873929] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Instance network_info: |[{"id": "d9a92fea-305d-4d67-b157-6b8348821277", "address": "fa:16:3e:38:b5:36", "network": {"id": "12618a1d-34e5-4061-9a87-bb76ac99721d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-2061483586-network", "subnets": [{"cidr": "192.168.128.0/28", 
"dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5ece736de8ef4e4580cf02fcc0e07a86", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4307c18-b235-43cd-bcd5-e226012d8ee9", "external-id": "nsx-vlan-transportzone-867", "segmentation_id": 867, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd9a92fea-30", "ovs_interfaceid": "d9a92fea-305d-4d67-b157-6b8348821277", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 938.874330] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:38:b5:36', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a4307c18-b235-43cd-bcd5-e226012d8ee9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd9a92fea-305d-4d67-b157-6b8348821277', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 938.882128] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Creating folder: Project (5ece736de8ef4e4580cf02fcc0e07a86). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 938.882717] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e66a8b53-bb8c-40cf-85f7-c676318a9c24 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 938.894663] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Created folder: Project (5ece736de8ef4e4580cf02fcc0e07a86) in parent group-v814248. [ 938.894874] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Creating folder: Instances. Parent ref: group-v814297. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 938.895191] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b2056e8a-a871-4359-9883-129def9222ea {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 938.906968] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Created folder: Instances in parent group-v814297. 
[ 938.907242] env[67270]: DEBUG oslo.service.loopingcall [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 938.907435] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 938.907641] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1f4a9ea1-3f24-4b14-9655-7293e2e9811e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 938.928622] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 938.928622] env[67270]: value = "task-4110633" [ 938.928622] env[67270]: _type = "Task" [ 938.928622] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 938.937074] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110633, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 938.939469] env[67270]: DEBUG nova.compute.manager [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Received event network-vif-plugged-d9a92fea-305d-4d67-b157-6b8348821277 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 938.939684] env[67270]: DEBUG oslo_concurrency.lockutils [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] Acquiring lock "49292f00-1457-438b-b5b7-2ac35dd464d2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 938.939887] env[67270]: DEBUG oslo_concurrency.lockutils [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] Lock "49292f00-1457-438b-b5b7-2ac35dd464d2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 938.940053] env[67270]: DEBUG oslo_concurrency.lockutils [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] Lock "49292f00-1457-438b-b5b7-2ac35dd464d2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 938.940220] env[67270]: DEBUG nova.compute.manager [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] No waiting events found dispatching network-vif-plugged-d9a92fea-305d-4d67-b157-6b8348821277 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 938.940380] env[67270]: WARNING nova.compute.manager [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Received unexpected event network-vif-plugged-d9a92fea-305d-4d67-b157-6b8348821277 
for instance with vm_state building and task_state spawning. [ 938.940532] env[67270]: DEBUG nova.compute.manager [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Received event network-changed-d9a92fea-305d-4d67-b157-6b8348821277 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 938.940681] env[67270]: DEBUG nova.compute.manager [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Refreshing instance network info cache due to event network-changed-d9a92fea-305d-4d67-b157-6b8348821277. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 938.940925] env[67270]: DEBUG oslo_concurrency.lockutils [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] Acquiring lock "refresh_cache-49292f00-1457-438b-b5b7-2ac35dd464d2" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 938.941109] env[67270]: DEBUG oslo_concurrency.lockutils [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] Acquired lock "refresh_cache-49292f00-1457-438b-b5b7-2ac35dd464d2" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 938.941253] env[67270]: DEBUG nova.network.neutron [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Refreshing network info cache for port d9a92fea-305d-4d67-b157-6b8348821277 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 939.259743] env[67270]: DEBUG nova.network.neutron [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Updated VIF entry in instance network info cache for port d9a92fea-305d-4d67-b157-6b8348821277. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 939.260600] env[67270]: DEBUG nova.network.neutron [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Updating instance_info_cache with network_info: [{"id": "d9a92fea-305d-4d67-b157-6b8348821277", "address": "fa:16:3e:38:b5:36", "network": {"id": "12618a1d-34e5-4061-9a87-bb76ac99721d", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-2061483586-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5ece736de8ef4e4580cf02fcc0e07a86", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4307c18-b235-43cd-bcd5-e226012d8ee9", "external-id": "nsx-vlan-transportzone-867", "segmentation_id": 867, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd9a92fea-30", "ovs_interfaceid": "d9a92fea-305d-4d67-b157-6b8348821277", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 939.271730] env[67270]: DEBUG oslo_concurrency.lockutils [req-7d206b6a-98c4-477f-b759-634a46424da7 req-99ef36a2-394d-4803-8240-2fb45b402818 service nova] Releasing lock "refresh_cache-49292f00-1457-438b-b5b7-2ac35dd464d2" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 939.440736] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110633, 'name': CreateVM_Task, 'duration_secs': 0.373694} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 939.440918] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 939.441765] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 939.441939] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 939.442327] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 939.442595] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bc8abedf-7edc-4b07-9d66-276355212716 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 939.448278] env[67270]: DEBUG oslo_vmware.api [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Waiting for the task: (returnval){ [ 939.448278] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52f6cceb-501e-ad84-163c-9ae57c5ef921" [ 939.448278] env[67270]: _type = "Task" [ 939.448278] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 939.456922] env[67270]: DEBUG oslo_vmware.api [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52f6cceb-501e-ad84-163c-9ae57c5ef921, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 939.959354] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 939.959354] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 939.959354] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 948.024714] env[67270]: WARNING oslo_vmware.rw_handles [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 948.024714] env[67270]: ERROR oslo_vmware.rw_handles [ 948.024714] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/0c19fe5f-24d2-4576-8ff1-e0889fcb630e/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore2 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 948.026100] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 
tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 948.026344] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Copying Virtual Disk [datastore2] vmware_temp/0c19fe5f-24d2-4576-8ff1-e0889fcb630e/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore2] vmware_temp/0c19fe5f-24d2-4576-8ff1-e0889fcb630e/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 948.026613] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-65fffefe-33b0-4ba9-85a7-7ec7d6c92fa1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.035634] env[67270]: DEBUG oslo_vmware.api [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Waiting for the task: (returnval){ [ 948.035634] env[67270]: value = "task-4110634" [ 948.035634] env[67270]: _type = "Task" [ 948.035634] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.044214] env[67270]: DEBUG oslo_vmware.api [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Task: {'id': task-4110634, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.546819] env[67270]: DEBUG oslo_vmware.exceptions [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Fault InvalidArgument not matched. 
{{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 948.547067] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 948.547611] env[67270]: ERROR nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 948.547611] env[67270]: Faults: ['InvalidArgument'] [ 948.547611] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Traceback (most recent call last): [ 948.547611] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 948.547611] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] yield resources [ 948.547611] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 948.547611] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] self.driver.spawn(context, instance, image_meta, [ 948.547611] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 948.547611] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] self._vmops.spawn(context, instance, image_meta, injected_files, [ 948.547611] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 948.547611] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] self._fetch_image_if_missing(context, vi) [ 948.547611] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] image_cache(vi, tmp_image_ds_loc) [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] vm_util.copy_virtual_disk( [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] session._wait_for_task(vmdk_copy_task) [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] return self.wait_for_task(task_ref) [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] return evt.wait() [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] result = hub.switch() [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 948.548051] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] return self.greenlet.switch() [ 948.548459] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 948.548459] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] self.f(*self.args, **self.kw) [ 948.548459] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 948.548459] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] raise exceptions.translate_fault(task_info.error) [ 948.548459] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 948.548459] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Faults: ['InvalidArgument'] [ 948.548459] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] [ 948.548459] env[67270]: INFO nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Terminating instance [ 948.549467] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 948.549672] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 948.550298] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Start destroying the instance 
on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 948.550487] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 948.550698] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2149cb0f-2d99-482e-bd02-2eb73a77900a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.553118] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b30d906-bcee-4fd9-8e9e-839c32a1b0a5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.560581] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 948.560849] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9b00cbfb-1d76-4e5f-99be-cca9d328b7eb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.563127] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 948.563302] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 948.564247] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-030b3ed6-a30f-4ba5-8666-072e79b2f375 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.568955] env[67270]: DEBUG oslo_vmware.api [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Waiting for the task: (returnval){ [ 948.568955] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52cc38de-4c74-1f9b-a61f-3a94de1f3a28" [ 948.568955] env[67270]: _type = "Task" [ 948.568955] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.576589] env[67270]: DEBUG oslo_vmware.api [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52cc38de-4c74-1f9b-a61f-3a94de1f3a28, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.962434] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 948.962657] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Deleting contents of the VM from datastore datastore2 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 948.963222] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Deleting the datastore file [datastore2] 69980b41-9514-4d97-aa75-ea68dd05b241 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 948.963222] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8d4a36b0-14ba-4f68-9b4c-549c524e09a2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.970718] env[67270]: DEBUG oslo_vmware.api [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Waiting for the task: (returnval){ [ 948.970718] env[67270]: value = "task-4110636" [ 948.970718] env[67270]: _type = "Task" [ 948.970718] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.980206] env[67270]: DEBUG oslo_vmware.api [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Task: {'id': task-4110636, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 949.079482] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 949.079836] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Creating directory with path [datastore2] vmware_temp/b3f76792-025b-4ef5-af1f-79365efc20c8/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 949.079970] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3354e421-c171-4e50-8054-fc588843c9c0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.092815] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Created directory with path [datastore2] vmware_temp/b3f76792-025b-4ef5-af1f-79365efc20c8/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 949.092972] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Fetch image to [datastore2] vmware_temp/b3f76792-025b-4ef5-af1f-79365efc20c8/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 949.093114] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore2] vmware_temp/b3f76792-025b-4ef5-af1f-79365efc20c8/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore2 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 949.094095] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-748f8ab7-521f-4ba9-a6ad-89569a39d070 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.101558] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c7a1d56-e4b9-4e4b-864f-e1e2daa4cacb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.113864] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-586949ef-3752-4d17-ae90-6c7184c3f870 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.143373] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e970133-4a4e-4bcc-ac74-03ed7b838fa0 {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.150353] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-25175ee9-a3a5-4fc9-8fdc-e1437ac566dd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.174140] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore2 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 949.222522] env[67270]: DEBUG oslo_vmware.rw_handles [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b3f76792-025b-4ef5-af1f-79365efc20c8/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 949.284937] env[67270]: DEBUG oslo_vmware.rw_handles [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 949.285111] env[67270]: DEBUG oslo_vmware.rw_handles [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b3f76792-025b-4ef5-af1f-79365efc20c8/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 949.482057] env[67270]: DEBUG oslo_vmware.api [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Task: {'id': task-4110636, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.089902} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 949.482260] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 949.482425] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Deleted contents of the VM from datastore datastore2 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 949.482602] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 949.482783] env[67270]: INFO nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Took 0.93 seconds to destroy the instance on the hypervisor. [ 949.485046] env[67270]: DEBUG nova.compute.claims [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 949.485232] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.485452] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.856816] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c06d03ea-f331-42ab-bcfa-8e312a139f4c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.865069] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdaf980b-e30a-4dd3-bade-bf304e400472 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.897581] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-523bfb12-a50c-4000-8b4e-e24bd126ad82 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.905936] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b29bb227-246f-4bf0-9392-8169e52ef688 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.920469] env[67270]: DEBUG nova.compute.provider_tree [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 949.929895] env[67270]: DEBUG nova.scheduler.client.report [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 949.944964] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.459s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.945537] env[67270]: ERROR nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 949.945537] env[67270]: Faults: ['InvalidArgument'] [ 949.945537] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Traceback (most recent call last): [ 949.945537] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 949.945537] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] self.driver.spawn(context, instance, image_meta, [ 949.945537] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 949.945537] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] self._vmops.spawn(context, instance, image_meta, injected_files, [ 949.945537] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 949.945537] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] self._fetch_image_if_missing(context, vi) [ 949.945537] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 949.945537] env[67270]: ERROR nova.compute.manager [instance: 
69980b41-9514-4d97-aa75-ea68dd05b241] image_cache(vi, tmp_image_ds_loc) [ 949.945537] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] vm_util.copy_virtual_disk( [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] session._wait_for_task(vmdk_copy_task) [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] return self.wait_for_task(task_ref) [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] return evt.wait() [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] result = hub.switch() [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] return self.greenlet.switch() [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 949.945951] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] self.f(*self.args, **self.kw) [ 949.946371] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 949.946371] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] raise exceptions.translate_fault(task_info.error) [ 949.946371] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 949.946371] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Faults: ['InvalidArgument'] [ 949.946371] env[67270]: ERROR nova.compute.manager [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] [ 949.946371] env[67270]: DEBUG nova.compute.utils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 949.947754] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b 
tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Build of instance 69980b41-9514-4d97-aa75-ea68dd05b241 was re-scheduled: A specified parameter was not correct: fileType [ 949.947754] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 949.948141] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 949.948319] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 949.948485] env[67270]: DEBUG nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 949.948643] env[67270]: DEBUG nova.network.neutron [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 950.343785] env[67270]: DEBUG nova.network.neutron [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.356178] env[67270]: INFO nova.compute.manager [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] [instance: 69980b41-9514-4d97-aa75-ea68dd05b241] Took 0.41 seconds to deallocate network for instance. 
[ 950.459390] env[67270]: INFO nova.scheduler.client.report [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Deleted allocations for instance 69980b41-9514-4d97-aa75-ea68dd05b241 [ 950.477817] env[67270]: DEBUG oslo_concurrency.lockutils [None req-299b3b6f-7d96-4190-bc2a-7f17b340084b tempest-AttachInterfacesTestJSON-1867053361 tempest-AttachInterfacesTestJSON-1867053361-project-member] Lock "69980b41-9514-4d97-aa75-ea68dd05b241" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 188.198s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 950.500593] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 950.552254] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 950.552511] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 950.554095] env[67270]: INFO nova.compute.claims [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 950.917965] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f60c8f6-e13a-4ab6-99d8-7876fe2554f3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.927180] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-769e8da1-6d3c-4e11-a13f-e8cccf0fa1e0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.957911] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab297c31-1dca-4254-bb8c-2926bdc6e3ce {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.965757] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c11626c5-3ded-4c27-82f5-b664ec3363b3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.979013] env[67270]: DEBUG nova.compute.provider_tree [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 
tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 950.987737] env[67270]: DEBUG nova.scheduler.client.report [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 951.003914] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.451s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.004420] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 951.035762] env[67270]: DEBUG nova.compute.utils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 951.037535] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 951.037751] env[67270]: DEBUG nova.network.neutron [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 951.046505] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Start building block device mappings for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 951.132679] env[67270]: DEBUG nova.policy [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '62db616aa15c40228c8824758952ea85', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70512dc49d5f453ea28c742bbf9fd2d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 951.140412] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 951.165463] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 951.165722] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 951.165877] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 951.166081] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 951.166231] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Image pref 0:0:0 {{(pid=67270) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 951.166512] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 951.166728] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 951.166901] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 951.167367] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 951.168634] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 951.168634] env[67270]: DEBUG nova.virt.hardware [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 951.168778] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca1df886-0d00-404f-b957-446108bae103 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 951.177944] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dc511bc-1826-47f6-8ce8-55010218c360 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 951.559462] env[67270]: DEBUG nova.network.neutron [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Successfully created port: b839777a-f0a4-4ba5-8c3d-24fcdec902c7 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 952.556675] env[67270]: DEBUG nova.network.neutron [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Successfully updated port: 
b839777a-f0a4-4ba5-8c3d-24fcdec902c7 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 952.572943] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Acquiring lock "refresh_cache-87ef9733-e8d6-429e-b23f-8b8aadef784c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 952.573278] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Acquired lock "refresh_cache-87ef9733-e8d6-429e-b23f-8b8aadef784c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 952.573278] env[67270]: DEBUG nova.network.neutron [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 952.576173] env[67270]: DEBUG nova.compute.manager [req-2c0d003d-90f7-48de-b0e6-6b3e98be4d57 req-7d5d96f6-5203-4d53-b4cc-a245423c59c0 service nova] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Received event network-vif-plugged-b839777a-f0a4-4ba5-8c3d-24fcdec902c7 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 952.576951] env[67270]: DEBUG oslo_concurrency.lockutils [req-2c0d003d-90f7-48de-b0e6-6b3e98be4d57 req-7d5d96f6-5203-4d53-b4cc-a245423c59c0 service nova] Acquiring lock "87ef9733-e8d6-429e-b23f-8b8aadef784c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 952.576951] env[67270]: DEBUG oslo_concurrency.lockutils [req-2c0d003d-90f7-48de-b0e6-6b3e98be4d57 req-7d5d96f6-5203-4d53-b4cc-a245423c59c0 service nova] Lock "87ef9733-e8d6-429e-b23f-8b8aadef784c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 952.576951] env[67270]: DEBUG oslo_concurrency.lockutils [req-2c0d003d-90f7-48de-b0e6-6b3e98be4d57 req-7d5d96f6-5203-4d53-b4cc-a245423c59c0 service nova] Lock "87ef9733-e8d6-429e-b23f-8b8aadef784c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 952.576951] env[67270]: DEBUG nova.compute.manager [req-2c0d003d-90f7-48de-b0e6-6b3e98be4d57 req-7d5d96f6-5203-4d53-b4cc-a245423c59c0 service nova] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] No waiting events found dispatching network-vif-plugged-b839777a-f0a4-4ba5-8c3d-24fcdec902c7 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 952.577651] env[67270]: WARNING nova.compute.manager [req-2c0d003d-90f7-48de-b0e6-6b3e98be4d57 req-7d5d96f6-5203-4d53-b4cc-a245423c59c0 service nova] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Received unexpected event network-vif-plugged-b839777a-f0a4-4ba5-8c3d-24fcdec902c7 for instance with vm_state building and task_state spawning. 
[ 952.627018] env[67270]: DEBUG nova.network.neutron [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 952.843812] env[67270]: DEBUG nova.network.neutron [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Updating instance_info_cache with network_info: [{"id": "b839777a-f0a4-4ba5-8c3d-24fcdec902c7", "address": "fa:16:3e:48:a0:1e", "network": {"id": "4908c5f8-0f3c-4db0-ad1b-bbf988e3a6d6", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1113010113-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "70512dc49d5f453ea28c742bbf9fd2d6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da0234c8-1a2d-47ff-9a72-2e7d35b49214", "external-id": "nsx-vlan-transportzone-788", "segmentation_id": 788, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb839777a-f0", "ovs_interfaceid": "b839777a-f0a4-4ba5-8c3d-24fcdec902c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 952.861433] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Releasing lock "refresh_cache-87ef9733-e8d6-429e-b23f-8b8aadef784c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 952.861728] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Instance network_info: |[{"id": "b839777a-f0a4-4ba5-8c3d-24fcdec902c7", "address": "fa:16:3e:48:a0:1e", "network": {"id": "4908c5f8-0f3c-4db0-ad1b-bbf988e3a6d6", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1113010113-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "70512dc49d5f453ea28c742bbf9fd2d6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da0234c8-1a2d-47ff-9a72-2e7d35b49214", "external-id": "nsx-vlan-transportzone-788", 
"segmentation_id": 788, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb839777a-f0", "ovs_interfaceid": "b839777a-f0a4-4ba5-8c3d-24fcdec902c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 952.862135] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:48:a0:1e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'da0234c8-1a2d-47ff-9a72-2e7d35b49214', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b839777a-f0a4-4ba5-8c3d-24fcdec902c7', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 952.869516] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Creating folder: Project (70512dc49d5f453ea28c742bbf9fd2d6). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 952.870097] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-efc5145e-d8db-4fd3-b472-927649785623 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.881574] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Created folder: Project (70512dc49d5f453ea28c742bbf9fd2d6) in parent group-v814248. [ 952.881761] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Creating folder: Instances. Parent ref: group-v814300. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 952.881991] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-87991c37-aba4-42a4-abf9-2c553e7c4d08 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.891473] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Created folder: Instances in parent group-v814300. [ 952.891574] env[67270]: DEBUG oslo.service.loopingcall [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 952.891732] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 952.891932] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-daff9fe7-053c-4ab4-a086-35166016fbbf {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.912136] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 952.912136] env[67270]: value = "task-4110639" [ 952.912136] env[67270]: _type = "Task" [ 952.912136] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 952.920370] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110639, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 953.422287] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110639, 'name': CreateVM_Task, 'duration_secs': 0.316888} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 953.422465] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 953.423132] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 953.423346] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 953.423668] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 953.423915] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-adfef7d2-bb1e-4a4b-b600-9d6ed240cddb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.428573] env[67270]: DEBUG oslo_vmware.api [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Waiting for the task: (returnval){ [ 953.428573] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]522c8460-f776-2095-a8f0-24349a76cff2" [ 953.428573] env[67270]: _type = "Task" [ 953.428573] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 953.436870] env[67270]: DEBUG oslo_vmware.api [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]522c8460-f776-2095-a8f0-24349a76cff2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 953.940541] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 953.941502] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 953.941502] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 954.656258] env[67270]: DEBUG nova.compute.manager [req-7a0c395d-8dee-436f-9c23-9fa22b971974 req-455dbc4c-5dd8-43e8-84fc-e8550a245021 service nova] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Received event network-changed-b839777a-f0a4-4ba5-8c3d-24fcdec902c7 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 954.656608] env[67270]: DEBUG nova.compute.manager [req-7a0c395d-8dee-436f-9c23-9fa22b971974 req-455dbc4c-5dd8-43e8-84fc-e8550a245021 service nova] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Refreshing instance network info cache due to event network-changed-b839777a-f0a4-4ba5-8c3d-24fcdec902c7. 
{{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 954.656690] env[67270]: DEBUG oslo_concurrency.lockutils [req-7a0c395d-8dee-436f-9c23-9fa22b971974 req-455dbc4c-5dd8-43e8-84fc-e8550a245021 service nova] Acquiring lock "refresh_cache-87ef9733-e8d6-429e-b23f-8b8aadef784c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 954.656815] env[67270]: DEBUG oslo_concurrency.lockutils [req-7a0c395d-8dee-436f-9c23-9fa22b971974 req-455dbc4c-5dd8-43e8-84fc-e8550a245021 service nova] Acquired lock "refresh_cache-87ef9733-e8d6-429e-b23f-8b8aadef784c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 954.656975] env[67270]: DEBUG nova.network.neutron [req-7a0c395d-8dee-436f-9c23-9fa22b971974 req-455dbc4c-5dd8-43e8-84fc-e8550a245021 service nova] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Refreshing network info cache for port b839777a-f0a4-4ba5-8c3d-24fcdec902c7 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 954.963922] env[67270]: DEBUG nova.network.neutron [req-7a0c395d-8dee-436f-9c23-9fa22b971974 req-455dbc4c-5dd8-43e8-84fc-e8550a245021 service nova] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Updated VIF entry in instance network info cache for port b839777a-f0a4-4ba5-8c3d-24fcdec902c7. {{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 954.964430] env[67270]: DEBUG nova.network.neutron [req-7a0c395d-8dee-436f-9c23-9fa22b971974 req-455dbc4c-5dd8-43e8-84fc-e8550a245021 service nova] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Updating instance_info_cache with network_info: [{"id": "b839777a-f0a4-4ba5-8c3d-24fcdec902c7", "address": "fa:16:3e:48:a0:1e", "network": {"id": "4908c5f8-0f3c-4db0-ad1b-bbf988e3a6d6", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1113010113-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "70512dc49d5f453ea28c742bbf9fd2d6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da0234c8-1a2d-47ff-9a72-2e7d35b49214", "external-id": "nsx-vlan-transportzone-788", "segmentation_id": 788, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb839777a-f0", "ovs_interfaceid": "b839777a-f0a4-4ba5-8c3d-24fcdec902c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 954.974245] env[67270]: DEBUG oslo_concurrency.lockutils [req-7a0c395d-8dee-436f-9c23-9fa22b971974 req-455dbc4c-5dd8-43e8-84fc-e8550a245021 service nova] Releasing lock "refresh_cache-87ef9733-e8d6-429e-b23f-8b8aadef784c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 959.741788] env[67270]: DEBUG nova.compute.manager [req-8347b4f1-3d15-4d8a-a588-26c306fad776 req-a38fb3fe-3654-491a-b569-bd1d6ceef5c1 service nova] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Received event 
network-vif-deleted-1b6b43f7-23b2-4088-9933-ff0d804226e0 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 963.798358] env[67270]: DEBUG nova.compute.manager [req-809faf46-c208-4359-806d-ecfd27e1e981 req-c596e88f-d2cb-4c56-80f1-9588e4236452 service nova] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Received event network-vif-deleted-d9a92fea-305d-4d67-b157-6b8348821277 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 966.123443] env[67270]: DEBUG nova.compute.manager [req-b8851c1b-6d6f-4b81-9c2f-71015351e95c req-b62749e8-79c2-49c1-8776-23289026023c service nova] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Received event network-vif-deleted-b839777a-f0a4-4ba5-8c3d-24fcdec902c7 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 983.904752] env[67270]: WARNING oslo_vmware.rw_handles [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 983.904752] env[67270]: ERROR oslo_vmware.rw_handles [ 983.905273] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/743a8a9c-0128-4b9a-98fe-8458ac988edc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 983.907194] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 983.907453] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/743a8a9c-0128-4b9a-98fe-8458ac988edc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/743a8a9c-0128-4b9a-98fe-8458ac988edc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 983.907748] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-64050f15-b55d-40ff-87b0-377b519616f8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 983.919600] env[67270]: DEBUG oslo_vmware.api [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Waiting for the task: (returnval){ [ 983.919600] env[67270]: value = "task-4110640" [ 983.919600] env[67270]: _type = "Task" [ 983.919600] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 983.932205] env[67270]: DEBUG oslo_vmware.api [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Task: {'id': task-4110640, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 984.430647] env[67270]: DEBUG oslo_vmware.exceptions [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Fault InvalidArgument not matched. {{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 984.430647] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 984.432367] env[67270]: ERROR nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 984.432367] env[67270]: Faults: ['InvalidArgument'] [ 984.432367] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Traceback (most recent call last): [ 984.432367] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 984.432367] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] yield resources [ 984.432367] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 984.432367] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] self.driver.spawn(context, instance, image_meta, [ 984.432367] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 984.432367] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 984.432367] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 984.432367] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] self._fetch_image_if_missing(context, vi) [ 984.432367] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] image_cache(vi, tmp_image_ds_loc) [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] vm_util.copy_virtual_disk( [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] session._wait_for_task(vmdk_copy_task) [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] return self.wait_for_task(task_ref) [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] return evt.wait() [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] result = hub.switch() [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 984.432776] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] return self.greenlet.switch() [ 984.433115] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 984.433115] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] self.f(*self.args, **self.kw) [ 984.433115] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 984.433115] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] raise exceptions.translate_fault(task_info.error) [ 984.433115] env[67270]: ERROR nova.compute.manager [instance: 
c847f4cb-1914-497b-8d63-5b99a237e5e6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 984.433115] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Faults: ['InvalidArgument'] [ 984.433115] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] [ 984.433115] env[67270]: INFO nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Terminating instance [ 984.436299] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 984.436514] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 984.436764] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b0d01afb-a4fe-496a-a2bd-e8431d622f8c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.440146] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 984.440344] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 984.441169] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2acdbaf4-9ae9-4486-b7de-0b58a1b95520 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.446752] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 984.446928] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 984.450052] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0b13fc8e-0160-4b98-8847-b094d602e93e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.452534] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 984.452966] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-eda697dd-a4b0-49ea-a50b-6594fd7287c1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.459642] env[67270]: DEBUG oslo_vmware.api [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Waiting for the task: (returnval){ [ 984.459642] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]5236bcc4-a8e1-f707-eeb4-1de99d87f493" [ 984.459642] env[67270]: _type = "Task" [ 984.459642] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 984.468948] env[67270]: DEBUG oslo_vmware.api [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]5236bcc4-a8e1-f707-eeb4-1de99d87f493, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 984.534028] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 984.534028] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 984.534028] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Deleting the datastore file [datastore1] c847f4cb-1914-497b-8d63-5b99a237e5e6 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 984.534028] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dfb52e7a-c52b-453f-8f71-9fe4a4f234c6 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.541191] env[67270]: DEBUG oslo_vmware.api [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Waiting for the task: (returnval){ [ 984.541191] env[67270]: value = "task-4110642" [ 984.541191] env[67270]: _type = "Task" [ 984.541191] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 984.551287] env[67270]: DEBUG oslo_vmware.api [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Task: {'id': task-4110642, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 984.977192] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 984.977192] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Creating directory with path [datastore1] vmware_temp/0f9da5b7-3194-4647-9bbd-4524f622fe59/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 984.977192] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8b2b9651-d1ae-4c93-a0b3-5f5d78c6031d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.992163] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Created directory with path [datastore1] vmware_temp/0f9da5b7-3194-4647-9bbd-4524f622fe59/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 984.992429] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Fetch image to [datastore1] vmware_temp/0f9da5b7-3194-4647-9bbd-4524f622fe59/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 984.992709] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/0f9da5b7-3194-4647-9bbd-4524f622fe59/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 984.993608] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09441abe-8e79-472d-96ed-e3b2725a6076 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.007657] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66c4e8ad-3aee-4e57-afd6-3f2b46405c27 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.020219] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e2ad5f7-16ae-4983-87dd-c66005482d5e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.060512] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4e22673-4cfc-4674-bf22-4ef7e3c45c61 {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.068121] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fa1fb59b-b820-4e10-ac39-b8b81c464fc6 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.073038] env[67270]: DEBUG oslo_vmware.api [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Task: {'id': task-4110642, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084706} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 985.073977] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 985.073977] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 985.074309] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 985.074561] env[67270]: INFO nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Took 0.63 seconds to destroy the instance on the hypervisor. 
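The entries above (CreateVM_Task, SearchDatastore_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task) all follow the same oslo.vmware pattern: invoke a *_Task SOAP method, then poll the returned task until it reports success or raises the translated fault. A minimal sketch of that pattern, assuming only oslo.vmware's public VMwareAPISession API; the endpoint, credentials, and datastore paths are placeholders (not values from this log), and the call arguments are trimmed for brevity:

from oslo_vmware import api as vmware_api
from oslo_vmware import exceptions as vmware_exc

# Placeholder endpoint and credentials; the 0.5 s poll interval mirrors
# the roughly half-second cadence of the _poll_task entries above.
session = vmware_api.VMwareAPISession(
    'vc.example.test', 'user', 'secret',
    api_retry_count=3, task_poll_interval=0.5)

disk_mgr = session.vim.service_content.virtualDiskManager
# *_Task methods return a task reference immediately; wait_for_task()
# polls TaskInfo until SUCCESS and re-raises any task error as a
# translated oslo_vmware exception.
task = session.invoke_api(
    session.vim, 'CopyVirtualDisk_Task', disk_mgr,
    sourceName='[datastore1] tmp/src.vmdk',
    destName='[datastore1] cache/dst.vmdk')
try:
    session.wait_for_task(task)
except vmware_exc.VimFaultException as err:
    # This is the path taken above when CopyVirtualDisk_Task failed
    # with InvalidArgument on fileType.
    print(err.fault_list, err)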
[ 985.079247] env[67270]: DEBUG nova.compute.claims [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 985.079506] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 985.079738] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 985.106787] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 985.264603] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22df98b2-4840-4719-b050-3d23bb365759 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.274539] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cde6767-1c21-4a64-a220-6a5a755138d4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.326348] env[67270]: DEBUG oslo_vmware.rw_handles [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f9da5b7-3194-4647-9bbd-4524f622fe59/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 985.326348] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e56cdabb-6242-4ac5-8bca-d4e2584ebd26 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.389132] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5822cb56-cdf3-481f-a051-0453858c2aa1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 985.396388] env[67270]: DEBUG oslo_vmware.rw_handles [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Completed reading data from the image iterator. 
{{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 985.396388] env[67270]: DEBUG oslo_vmware.rw_handles [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f9da5b7-3194-4647-9bbd-4524f622fe59/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 985.411074] env[67270]: DEBUG nova.compute.provider_tree [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 985.421370] env[67270]: DEBUG nova.scheduler.client.report [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 985.439390] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.359s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 985.439978] env[67270]: ERROR nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 985.439978] env[67270]: Faults: ['InvalidArgument'] [ 985.439978] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Traceback (most recent call last): [ 985.439978] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 985.439978] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] self.driver.spawn(context, instance, image_meta, [ 985.439978] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 985.439978] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 985.439978] env[67270]: ERROR nova.compute.manager [instance: 
c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 985.439978] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] self._fetch_image_if_missing(context, vi) [ 985.439978] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 985.439978] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] image_cache(vi, tmp_image_ds_loc) [ 985.439978] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] vm_util.copy_virtual_disk( [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] session._wait_for_task(vmdk_copy_task) [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] return self.wait_for_task(task_ref) [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] return evt.wait() [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] result = hub.switch() [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] return self.greenlet.switch() [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 985.440464] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] self.f(*self.args, **self.kw) [ 985.440860] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 985.440860] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] raise exceptions.translate_fault(task_info.error) [ 985.440860] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 985.440860] env[67270]: ERROR nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Faults: ['InvalidArgument'] [ 985.440860] env[67270]: ERROR 
nova.compute.manager [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] [ 985.440860] env[67270]: DEBUG nova.compute.utils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 985.442859] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Build of instance c847f4cb-1914-497b-8d63-5b99a237e5e6 was re-scheduled: A specified parameter was not correct: fileType [ 985.442859] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 985.443266] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 985.443438] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 985.443598] env[67270]: DEBUG nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 985.443758] env[67270]: DEBUG nova.network.neutron [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 986.952491] env[67270]: DEBUG nova.network.neutron [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 986.964983] env[67270]: INFO nova.compute.manager [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Took 1.52 seconds to deallocate network for instance. 
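The compute_resources claim abort above is serialized with oslo.concurrency named locks, which emit the Acquiring/acquired/released lines together with the waited and held times. A minimal sketch of the same primitive, assuming only oslo.concurrency; the lock names mirror the log and the function bodies are illustrative:

from oslo_concurrency import lockutils

# Decorator form, matching entries like: Lock "compute_resources"
# acquired by "...ResourceTracker.abort_instance_claim". lockutils
# logs the Acquiring/acquired/released lines at DEBUG, including how
# long the caller waited for and then held the lock.
@lockutils.synchronized('compute_resources')
def abort_instance_claim():
    pass  # body illustrative only

# Context-manager form, matching the per-instance
# "refresh_cache-<uuid>" locks above (placeholder UUID).
with lockutils.lock('refresh_cache-00000000-0000-0000-0000-000000000000'):
    pass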
[ 987.082972] env[67270]: INFO nova.scheduler.client.report [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Deleted allocations for instance c847f4cb-1914-497b-8d63-5b99a237e5e6 [ 987.103256] env[67270]: DEBUG oslo_concurrency.lockutils [None req-120c9ac9-ed6e-494e-9642-af62484f77bf tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Lock "c847f4cb-1914-497b-8d63-5b99a237e5e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 303.896s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 987.105756] env[67270]: DEBUG oslo_concurrency.lockutils [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Lock "c847f4cb-1914-497b-8d63-5b99a237e5e6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 105.233s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 987.106691] env[67270]: DEBUG oslo_concurrency.lockutils [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Acquiring lock "c847f4cb-1914-497b-8d63-5b99a237e5e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 987.106945] env[67270]: DEBUG oslo_concurrency.lockutils [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Lock "c847f4cb-1914-497b-8d63-5b99a237e5e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 987.107186] env[67270]: DEBUG oslo_concurrency.lockutils [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Lock "c847f4cb-1914-497b-8d63-5b99a237e5e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 987.112406] env[67270]: INFO nova.compute.manager [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Terminating instance [ 987.118341] env[67270]: DEBUG nova.compute.manager [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Start destroying the instance on the hypervisor. 
{{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 987.119409] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 987.119984] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e9cfce5c-fc4a-4a65-9d81-d1c8cd4dcee7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 987.125975] env[67270]: DEBUG nova.compute.manager [None req-f4fd7ac3-4900-4e59-88d7-2523d6dd78ec tempest-ServerPasswordTestJSON-958540676 tempest-ServerPasswordTestJSON-958540676-project-member] [instance: 2f050e13-5621-4dda-ade1-cfbef017e57e] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 987.138212] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-326eff31-1ca3-4337-a764-39dd04df2dfb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 987.162965] env[67270]: DEBUG nova.compute.manager [None req-f4fd7ac3-4900-4e59-88d7-2523d6dd78ec tempest-ServerPasswordTestJSON-958540676 tempest-ServerPasswordTestJSON-958540676-project-member] [instance: 2f050e13-5621-4dda-ade1-cfbef017e57e] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 987.179056] env[67270]: WARNING nova.virt.vmwareapi.vmops [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c847f4cb-1914-497b-8d63-5b99a237e5e6 could not be found. [ 987.179395] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 987.179646] env[67270]: INFO nova.compute.manager [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Took 0.06 seconds to destroy the instance on the hypervisor. [ 987.179919] env[67270]: DEBUG oslo.service.loopingcall [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 987.180235] env[67270]: DEBUG nova.compute.manager [-] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 987.180334] env[67270]: DEBUG nova.network.neutron [-] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 987.199618] env[67270]: DEBUG oslo_concurrency.lockutils [None req-f4fd7ac3-4900-4e59-88d7-2523d6dd78ec tempest-ServerPasswordTestJSON-958540676 tempest-ServerPasswordTestJSON-958540676-project-member] Lock "2f050e13-5621-4dda-ade1-cfbef017e57e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.311s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 987.214262] env[67270]: DEBUG nova.compute.manager [None req-71d402b3-595b-487e-b083-c7ea072f93d0 tempest-ServersNegativeTestJSON-834909547 tempest-ServersNegativeTestJSON-834909547-project-member] [instance: 4a1a791f-36f3-48af-9792-4a9eaeba26c9] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 987.240097] env[67270]: DEBUG nova.compute.manager [None req-71d402b3-595b-487e-b083-c7ea072f93d0 tempest-ServersNegativeTestJSON-834909547 tempest-ServersNegativeTestJSON-834909547-project-member] [instance: 4a1a791f-36f3-48af-9792-4a9eaeba26c9] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 987.263127] env[67270]: DEBUG oslo_concurrency.lockutils [None req-71d402b3-595b-487e-b083-c7ea072f93d0 tempest-ServersNegativeTestJSON-834909547 tempest-ServersNegativeTestJSON-834909547-project-member] Lock "4a1a791f-36f3-48af-9792-4a9eaeba26c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.365s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 987.276873] env[67270]: DEBUG nova.compute.manager [None req-02016afc-c690-44b1-ae7f-e4f0679a6a37 tempest-ServerTagsTestJSON-710292736 tempest-ServerTagsTestJSON-710292736-project-member] [instance: 907dfc72-e766-4a24-a4e7-df762db37824] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 987.307309] env[67270]: DEBUG nova.compute.manager [None req-02016afc-c690-44b1-ae7f-e4f0679a6a37 tempest-ServerTagsTestJSON-710292736 tempest-ServerTagsTestJSON-710292736-project-member] [instance: 907dfc72-e766-4a24-a4e7-df762db37824] Instance disappeared before build. 
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 987.340124] env[67270]: DEBUG oslo_concurrency.lockutils [None req-02016afc-c690-44b1-ae7f-e4f0679a6a37 tempest-ServerTagsTestJSON-710292736 tempest-ServerTagsTestJSON-710292736-project-member] Lock "907dfc72-e766-4a24-a4e7-df762db37824" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.002s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 987.353584] env[67270]: DEBUG nova.compute.manager [None req-2496d48d-9e7c-41af-ac0c-261720e759a6 tempest-ImagesOneServerNegativeTestJSON-359832043 tempest-ImagesOneServerNegativeTestJSON-359832043-project-member] [instance: cbe3ecc4-3c5b-4749-a21c-c0376583c4aa] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 987.389761] env[67270]: DEBUG nova.compute.manager [None req-2496d48d-9e7c-41af-ac0c-261720e759a6 tempest-ImagesOneServerNegativeTestJSON-359832043 tempest-ImagesOneServerNegativeTestJSON-359832043-project-member] [instance: cbe3ecc4-3c5b-4749-a21c-c0376583c4aa] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 987.424171] env[67270]: DEBUG oslo_concurrency.lockutils [None req-2496d48d-9e7c-41af-ac0c-261720e759a6 tempest-ImagesOneServerNegativeTestJSON-359832043 tempest-ImagesOneServerNegativeTestJSON-359832043-project-member] Lock "cbe3ecc4-3c5b-4749-a21c-c0376583c4aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 220.464s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 987.440253] env[67270]: DEBUG nova.compute.manager [None req-94475fdc-a8d0-4259-961d-5e3d6a6a61b6 tempest-DeleteServersTestJSON-2013465335 tempest-DeleteServersTestJSON-2013465335-project-member] [instance: 6546bb93-d032-4b32-b42f-49bbf36b8e82] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 987.475769] env[67270]: DEBUG nova.compute.manager [None req-94475fdc-a8d0-4259-961d-5e3d6a6a61b6 tempest-DeleteServersTestJSON-2013465335 tempest-DeleteServersTestJSON-2013465335-project-member] [instance: 6546bb93-d032-4b32-b42f-49bbf36b8e82] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 987.505586] env[67270]: DEBUG oslo_concurrency.lockutils [None req-94475fdc-a8d0-4259-961d-5e3d6a6a61b6 tempest-DeleteServersTestJSON-2013465335 tempest-DeleteServersTestJSON-2013465335-project-member] Lock "6546bb93-d032-4b32-b42f-49bbf36b8e82" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 219.665s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 987.518650] env[67270]: DEBUG nova.compute.manager [None req-f9750147-a710-49b3-96cf-b0b3248c9e82 tempest-VolumesAdminNegativeTest-1789479060 tempest-VolumesAdminNegativeTest-1789479060-project-member] [instance: f42f9cc0-c33a-4bdc-b16c-8dec61896b27] Starting instance... 
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 987.527249] env[67270]: DEBUG nova.network.neutron [-] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 987.541125] env[67270]: INFO nova.compute.manager [-] [instance: c847f4cb-1914-497b-8d63-5b99a237e5e6] Took 0.36 seconds to deallocate network for instance.
[ 987.560939] env[67270]: DEBUG nova.compute.manager [None req-f9750147-a710-49b3-96cf-b0b3248c9e82 tempest-VolumesAdminNegativeTest-1789479060 tempest-VolumesAdminNegativeTest-1789479060-project-member] [instance: f42f9cc0-c33a-4bdc-b16c-8dec61896b27] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 987.589037] env[67270]: DEBUG oslo_concurrency.lockutils [None req-f9750147-a710-49b3-96cf-b0b3248c9e82 tempest-VolumesAdminNegativeTest-1789479060 tempest-VolumesAdminNegativeTest-1789479060-project-member] Lock "f42f9cc0-c33a-4bdc-b16c-8dec61896b27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.847s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 987.604625] env[67270]: DEBUG nova.compute.manager [None req-e6ac2fbe-ce0b-436e-8245-b1b738d351c2 tempest-DeleteServersTestJSON-2013465335 tempest-DeleteServersTestJSON-2013465335-project-member] [instance: 2a6c8de3-8974-4533-a474-c4242fd735c6] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 987.639132] env[67270]: DEBUG nova.compute.manager [None req-e6ac2fbe-ce0b-436e-8245-b1b738d351c2 tempest-DeleteServersTestJSON-2013465335 tempest-DeleteServersTestJSON-2013465335-project-member] [instance: 2a6c8de3-8974-4533-a474-c4242fd735c6] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 987.669066] env[67270]: DEBUG oslo_concurrency.lockutils [None req-89b8ab9f-9dcd-4aa3-81a2-c7e291ea1f86 tempest-ServersAdminNegativeTestJSON-2065954721 tempest-ServersAdminNegativeTestJSON-2065954721-project-member] Lock "c847f4cb-1914-497b-8d63-5b99a237e5e6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.564s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 987.676385] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6ac2fbe-ce0b-436e-8245-b1b738d351c2 tempest-DeleteServersTestJSON-2013465335 tempest-DeleteServersTestJSON-2013465335-project-member] Lock "2a6c8de3-8974-4533-a474-c4242fd735c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.344s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 987.688452] env[67270]: DEBUG nova.compute.manager [None req-a191e8b9-4f6a-4284-a10e-d1aeb80442ad tempest-MultipleCreateTestJSON-31533564 tempest-MultipleCreateTestJSON-31533564-project-member] [instance: 4e53a7b7-7194-4ceb-abef-5d0779effbfb] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 987.715697] env[67270]: DEBUG nova.compute.manager [None req-a191e8b9-4f6a-4284-a10e-d1aeb80442ad tempest-MultipleCreateTestJSON-31533564 tempest-MultipleCreateTestJSON-31533564-project-member] [instance: 4e53a7b7-7194-4ceb-abef-5d0779effbfb] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 987.753168] env[67270]: DEBUG oslo_concurrency.lockutils [None req-a191e8b9-4f6a-4284-a10e-d1aeb80442ad tempest-MultipleCreateTestJSON-31533564 tempest-MultipleCreateTestJSON-31533564-project-member] Lock "4e53a7b7-7194-4ceb-abef-5d0779effbfb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.339s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 987.771542] env[67270]: DEBUG nova.compute.manager [None req-a191e8b9-4f6a-4284-a10e-d1aeb80442ad tempest-MultipleCreateTestJSON-31533564 tempest-MultipleCreateTestJSON-31533564-project-member] [instance: 4c9dbddd-4c74-4ee0-a1be-e7a5c7cfc344] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 987.806409] env[67270]: DEBUG nova.compute.manager [None req-a191e8b9-4f6a-4284-a10e-d1aeb80442ad tempest-MultipleCreateTestJSON-31533564 tempest-MultipleCreateTestJSON-31533564-project-member] [instance: 4c9dbddd-4c74-4ee0-a1be-e7a5c7cfc344] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 987.841162] env[67270]: DEBUG oslo_concurrency.lockutils [None req-a191e8b9-4f6a-4284-a10e-d1aeb80442ad tempest-MultipleCreateTestJSON-31533564 tempest-MultipleCreateTestJSON-31533564-project-member] Lock "4c9dbddd-4c74-4ee0-a1be-e7a5c7cfc344" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.388s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 987.852963] env[67270]: DEBUG nova.compute.manager [None req-7ea8e4d8-434a-4dbd-971a-6a8af1221e03 tempest-ServerRescueNegativeTestJSON-1936964094 tempest-ServerRescueNegativeTestJSON-1936964094-project-member] [instance: ee08ac0e-d7fb-4f36-962b-cb8b88bf6bb5] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 987.881237] env[67270]: DEBUG nova.compute.manager [None req-7ea8e4d8-434a-4dbd-971a-6a8af1221e03 tempest-ServerRescueNegativeTestJSON-1936964094 tempest-ServerRescueNegativeTestJSON-1936964094-project-member] [instance: ee08ac0e-d7fb-4f36-962b-cb8b88bf6bb5] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 987.905892] env[67270]: DEBUG oslo_concurrency.lockutils [None req-7ea8e4d8-434a-4dbd-971a-6a8af1221e03 tempest-ServerRescueNegativeTestJSON-1936964094 tempest-ServerRescueNegativeTestJSON-1936964094-project-member] Lock "ee08ac0e-d7fb-4f36-962b-cb8b88bf6bb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.680s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 987.918594] env[67270]: DEBUG nova.compute.manager [None req-2d09057f-9b62-40d7-8664-1194617e51eb tempest-ServerRescueNegativeTestJSON-1936964094 tempest-ServerRescueNegativeTestJSON-1936964094-project-member] [instance: a9aaa31c-5228-4210-b3c0-ca8c5a8c6213] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 987.961433] env[67270]: DEBUG nova.compute.manager [None req-2d09057f-9b62-40d7-8664-1194617e51eb tempest-ServerRescueNegativeTestJSON-1936964094 tempest-ServerRescueNegativeTestJSON-1936964094-project-member] [instance: a9aaa31c-5228-4210-b3c0-ca8c5a8c6213] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 987.990914] env[67270]: DEBUG oslo_concurrency.lockutils [None req-2d09057f-9b62-40d7-8664-1194617e51eb tempest-ServerRescueNegativeTestJSON-1936964094 tempest-ServerRescueNegativeTestJSON-1936964094-project-member] Lock "a9aaa31c-5228-4210-b3c0-ca8c5a8c6213" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.006s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 988.010103] env[67270]: DEBUG nova.compute.manager [None req-cba3a315-e09d-4ecf-9e44-4a01627e2758 tempest-ServersTestManualDisk-516412660 tempest-ServersTestManualDisk-516412660-project-member] [instance: 4dce8f09-ce7e-419c-90b4-48ee54d8c604] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 988.043354] env[67270]: DEBUG nova.compute.manager [None req-cba3a315-e09d-4ecf-9e44-4a01627e2758 tempest-ServersTestManualDisk-516412660 tempest-ServersTestManualDisk-516412660-project-member] [instance: 4dce8f09-ce7e-419c-90b4-48ee54d8c604] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 988.085749] env[67270]: DEBUG oslo_concurrency.lockutils [None req-cba3a315-e09d-4ecf-9e44-4a01627e2758 tempest-ServersTestManualDisk-516412660 tempest-ServersTestManualDisk-516412660-project-member] Lock "4dce8f09-ce7e-419c-90b4-48ee54d8c604" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.322s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 988.098387] env[67270]: DEBUG nova.compute.manager [None req-93251977-26dd-41aa-b8e3-b10604cd7e16 tempest-ServerActionsTestOtherA-1897363833 tempest-ServerActionsTestOtherA-1897363833-project-member] [instance: c372287f-35e3-402a-9841-6f55ea471d3d] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 988.128253] env[67270]: DEBUG nova.compute.manager [None req-93251977-26dd-41aa-b8e3-b10604cd7e16 tempest-ServerActionsTestOtherA-1897363833 tempest-ServerActionsTestOtherA-1897363833-project-member] [instance: c372287f-35e3-402a-9841-6f55ea471d3d] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 988.163354] env[67270]: DEBUG oslo_concurrency.lockutils [None req-93251977-26dd-41aa-b8e3-b10604cd7e16 tempest-ServerActionsTestOtherA-1897363833 tempest-ServerActionsTestOtherA-1897363833-project-member] Lock "c372287f-35e3-402a-9841-6f55ea471d3d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.618s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 988.179378] env[67270]: DEBUG nova.compute.manager [None req-ad659196-710f-478e-a478-1981e05dc130 tempest-ServerShowV254Test-191532395 tempest-ServerShowV254Test-191532395-project-member] [instance: 2de2d5d9-2644-408a-8957-2c169b2793ce] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 988.216764] env[67270]: DEBUG nova.compute.manager [None req-ad659196-710f-478e-a478-1981e05dc130 tempest-ServerShowV254Test-191532395 tempest-ServerShowV254Test-191532395-project-member] [instance: 2de2d5d9-2644-408a-8957-2c169b2793ce] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 988.259055] env[67270]: DEBUG oslo_concurrency.lockutils [None req-ad659196-710f-478e-a478-1981e05dc130 tempest-ServerShowV254Test-191532395 tempest-ServerShowV254Test-191532395-project-member] Lock "2de2d5d9-2644-408a-8957-2c169b2793ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.332s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 988.272611] env[67270]: DEBUG nova.compute.manager [None req-3d7e1c0a-a54e-436f-8bdb-6cc2c6f2b3fb tempest-ServerMetadataNegativeTestJSON-173231563 tempest-ServerMetadataNegativeTestJSON-173231563-project-member] [instance: 65422c06-b1cf-4868-8f38-391b08038fc9] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 988.308392] env[67270]: DEBUG nova.compute.manager [None req-3d7e1c0a-a54e-436f-8bdb-6cc2c6f2b3fb tempest-ServerMetadataNegativeTestJSON-173231563 tempest-ServerMetadataNegativeTestJSON-173231563-project-member] [instance: 65422c06-b1cf-4868-8f38-391b08038fc9] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 988.340017] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d7e1c0a-a54e-436f-8bdb-6cc2c6f2b3fb tempest-ServerMetadataNegativeTestJSON-173231563 tempest-ServerMetadataNegativeTestJSON-173231563-project-member] Lock "65422c06-b1cf-4868-8f38-391b08038fc9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.119s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 988.352328] env[67270]: DEBUG nova.compute.manager [None req-810fdb95-eb3e-4084-9138-9f2aed01baef tempest-ServerRescueTestJSONUnderV235-461983559 tempest-ServerRescueTestJSONUnderV235-461983559-project-member] [instance: e976fd9e-95a3-4564-9bd6-08ee3f15a188] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 988.388655] env[67270]: DEBUG nova.compute.manager [None req-810fdb95-eb3e-4084-9138-9f2aed01baef tempest-ServerRescueTestJSONUnderV235-461983559 tempest-ServerRescueTestJSONUnderV235-461983559-project-member] [instance: e976fd9e-95a3-4564-9bd6-08ee3f15a188] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 988.421376] env[67270]: DEBUG oslo_concurrency.lockutils [None req-810fdb95-eb3e-4084-9138-9f2aed01baef tempest-ServerRescueTestJSONUnderV235-461983559 tempest-ServerRescueTestJSONUnderV235-461983559-project-member] Lock "e976fd9e-95a3-4564-9bd6-08ee3f15a188" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 201.481s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 988.435745] env[67270]: DEBUG nova.compute.manager [None req-2b1d5d17-3319-4ea8-bc94-48cdb7ad94d1 tempest-ServersV294TestFqdnHostnames-831052524 tempest-ServersV294TestFqdnHostnames-831052524-project-member] [instance: 662bb470-e6ed-4a37-bb23-74a0a36dff0c] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 988.464889] env[67270]: DEBUG nova.compute.manager [None req-2b1d5d17-3319-4ea8-bc94-48cdb7ad94d1 tempest-ServersV294TestFqdnHostnames-831052524 tempest-ServersV294TestFqdnHostnames-831052524-project-member] [instance: 662bb470-e6ed-4a37-bb23-74a0a36dff0c] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 988.497230] env[67270]: DEBUG oslo_concurrency.lockutils [None req-2b1d5d17-3319-4ea8-bc94-48cdb7ad94d1 tempest-ServersV294TestFqdnHostnames-831052524 tempest-ServersV294TestFqdnHostnames-831052524-project-member] Lock "662bb470-e6ed-4a37-bb23-74a0a36dff0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.787s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 988.518894] env[67270]: DEBUG nova.compute.manager [None req-fe63a5af-f7d2-4520-b9c3-1fdc47f0f886 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: 25cc189a-383b-450c-810d-85ea2b48fdca] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 988.552434] env[67270]: DEBUG nova.compute.manager [None req-fe63a5af-f7d2-4520-b9c3-1fdc47f0f886 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] [instance: 25cc189a-383b-450c-810d-85ea2b48fdca] Instance disappeared before build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}}
[ 988.582371] env[67270]: DEBUG oslo_concurrency.lockutils [None req-fe63a5af-f7d2-4520-b9c3-1fdc47f0f886 tempest-DeleteServersAdminTestJSON-1214325291 tempest-DeleteServersAdminTestJSON-1214325291-project-member] Lock "25cc189a-383b-450c-810d-85ea2b48fdca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.398s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 993.797294] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 993.797656] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 993.797656] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 993.825043] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 993.825043] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 993.825043] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 993.825043] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 993.825043] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 993.827021] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 993.827179] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 993.827335] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 994.758532] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 994.758812] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 994.758940] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 995.755064] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 995.755322] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 995.779551] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 995.779551] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 995.792253] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 995.792531] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 995.792712] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 995.792874] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 995.794255] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b43deba5-557f-4e90-995f-c96719c68019 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 995.806269] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fb7ef8b-6928-4b1c-9eef-dfe12ddb1a9d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 995.824845] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4868fffe-b5b7-4abc-8d54-8a8052905889 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 995.833996] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62ef8f38-118c-4a37-b12a-b39113a2539e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 995.872277] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180622MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 995.872467] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 995.872664] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 995.938804] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 379f5a6d-d6d4-434a-b401-1b027434e6fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 995.938963] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a073c7a9-d7ee-4d9e-be23-4345ed5f9047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 995.939145] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 5d61c322-6a7d-4991-8cc4-6dcb1be74256 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 995.941068] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 995.941799] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 995.941799] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 996.032739] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf1ee4be-6bf4-4e66-8a66-d93e5e527093 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 996.043020] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4760d31d-4e53-4955-b1d6-907e9fd7dce0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 996.075288] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c03a5d77-ab7d-4dbf-aec9-ed1a1f05223e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 996.084256] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3b87ef4-5dfc-418d-8de9-29c34abbf329 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 996.098700] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 996.108580] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 996.124889] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 996.124889] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 998.044096] env[67270]: WARNING oslo_vmware.rw_handles [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles     response.begin()
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 998.044096] env[67270]: ERROR oslo_vmware.rw_handles
[ 998.044714] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/b3f76792-025b-4ef5-af1f-79365efc20c8/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore2 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 998.047702] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 998.047854] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Copying Virtual Disk [datastore2] vmware_temp/b3f76792-025b-4ef5-af1f-79365efc20c8/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore2] vmware_temp/b3f76792-025b-4ef5-af1f-79365efc20c8/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 998.048155] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3fda8307-31b9-46e9-9c31-a5d89541c749 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 998.059261] env[67270]: DEBUG oslo_vmware.api [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Waiting for the task: (returnval){
[ 998.059261] env[67270]: value = "task-4110643"
[ 998.059261] env[67270]: _type = "Task"
[ 998.059261] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 998.069223] env[67270]: DEBUG oslo_vmware.api [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Task: {'id': task-4110643, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 998.579112] env[67270]: DEBUG oslo_vmware.exceptions [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Fault InvalidArgument not matched. {{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 998.579604] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 998.580184] env[67270]: ERROR nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 998.580184] env[67270]: Faults: ['InvalidArgument']
[ 998.580184] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Traceback (most recent call last):
[ 998.580184] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 998.580184] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     yield resources
[ 998.580184] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 998.580184] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     self.driver.spawn(context, instance, image_meta,
[ 998.580184] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 998.580184] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 998.580184] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 998.580184] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     self._fetch_image_if_missing(context, vi)
[ 998.580184] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     image_cache(vi, tmp_image_ds_loc)
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     vm_util.copy_virtual_disk(
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     session._wait_for_task(vmdk_copy_task)
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     return self.wait_for_task(task_ref)
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     return evt.wait()
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     result = hub.switch()
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 998.581535] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     return self.greenlet.switch()
[ 998.582030] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 998.582030] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     self.f(*self.args, **self.kw)
[ 998.582030] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 998.582030] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]     raise exceptions.translate_fault(task_info.error)
[ 998.582030] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 998.582030] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Faults: ['InvalidArgument']
[ 998.582030] env[67270]: ERROR nova.compute.manager [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2]
[ 998.582030] env[67270]: INFO nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Terminating instance
[ 998.582924] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 998.583151] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 998.583790] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 998.583981] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 998.584218] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0cb078b4-a2f5-4638-9895-0f18b8e5ce9d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 998.587048] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3804cfd8-2571-4c63-90ac-6eb46d808413 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 998.594876] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 998.595147] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a63a2020-ed81-4411-a5ae-1fd916efe6a5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 998.597604] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 998.597776] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 998.598799] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f9b5c93d-9dd3-4505-abe9-fff3bae8a61b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 998.604416] env[67270]: DEBUG oslo_vmware.api [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Waiting for the task: (returnval){
[ 998.604416] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]5252a631-b30a-ee20-5647-d387495fcae0"
[ 998.604416] env[67270]: _type = "Task"
[ 998.604416] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 998.613178] env[67270]: DEBUG oslo_vmware.api [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]5252a631-b30a-ee20-5647-d387495fcae0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 998.676694] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 998.679506] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Deleting contents of the VM from datastore datastore2 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 998.679659] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Deleting the datastore file [datastore2] 49292f00-1457-438b-b5b7-2ac35dd464d2 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 998.680030] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-22246280-008b-438e-98ad-4d8b81a3f856 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 998.687020] env[67270]: DEBUG oslo_vmware.api [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Waiting for the task: (returnval){
[ 998.687020] env[67270]: value = "task-4110645"
[ 998.687020] env[67270]: _type = "Task"
[ 998.687020] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 998.695719] env[67270]: DEBUG oslo_vmware.api [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Task: {'id': task-4110645, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 999.119573] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 999.119854] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Creating directory with path [datastore2] vmware_temp/5c6d8c52-9c40-46f5-9222-883bc3a038ea/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 999.120108] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7259cf36-4db2-4523-9ef8-c2026344f818 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 999.137205] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Created directory with path [datastore2] vmware_temp/5c6d8c52-9c40-46f5-9222-883bc3a038ea/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 999.137205] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Fetch image to [datastore2] vmware_temp/5c6d8c52-9c40-46f5-9222-883bc3a038ea/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 999.137205] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore2] vmware_temp/5c6d8c52-9c40-46f5-9222-883bc3a038ea/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore2 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 999.137886] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6aac6c8-09be-47b7-a4a4-3c0c337f2ac1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 999.146154] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd54efd9-7af2-4b47-ae8d-b5de4056328d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 999.160550] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87714cab-6b03-47d6-9533-234a9284f867 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 999.205884] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fde8642-f4fa-49df-8df5-b8d362650f22 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 999.216780] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7dee60c0-958f-42a9-99b8-4bafdeed458d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 999.218746] env[67270]: DEBUG oslo_vmware.api [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Task: {'id': task-4110645, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081246} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 999.218988] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 999.219164] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Deleted contents of the VM from datastore datastore2 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 999.219331] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 999.219532] env[67270]: INFO nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Took 0.64 seconds to destroy the instance on the hypervisor.
[ 999.221758] env[67270]: DEBUG nova.compute.claims [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 999.221906] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 999.222252] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 999.247023] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore2 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 999.250862] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.029s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 999.251814] env[67270]: DEBUG nova.compute.utils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Instance 49292f00-1457-438b-b5b7-2ac35dd464d2 could not be found. {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 999.253655] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Instance disappeared during build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 999.253936] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 999.256621] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 999.256621] env[67270]: DEBUG nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 999.256621] env[67270]: DEBUG nova.network.neutron [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 999.288497] env[67270]: DEBUG nova.network.neutron [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 999.307767] env[67270]: INFO nova.compute.manager [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Took 0.05 seconds to deallocate network for instance.
[ 999.320241] env[67270]: DEBUG oslo_vmware.rw_handles [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5c6d8c52-9c40-46f5-9222-883bc3a038ea/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 999.393758] env[67270]: DEBUG oslo_vmware.rw_handles [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 999.394756] env[67270]: DEBUG oslo_vmware.rw_handles [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5c6d8c52-9c40-46f5-9222-883bc3a038ea/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 999.418974] env[67270]: DEBUG oslo_concurrency.lockutils [None req-b1d9e9b9-a8e4-4ced-8c4b-90ede5b33154 tempest-ServerDiskConfigTestJSON-1301518846 tempest-ServerDiskConfigTestJSON-1301518846-project-member] Lock "49292f00-1457-438b-b5b7-2ac35dd464d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 235.375s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1000.068205] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Acquiring lock "972c064e-2a9f-4afb-95b6-f6dd6b8a7a19" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1000.068935] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Lock "972c064e-2a9f-4afb-95b6-f6dd6b8a7a19" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1000.086539] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1000.143020] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1000.143020] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1000.143906] env[67270]: INFO nova.compute.claims [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1000.311515] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ddeb727-8b9e-44c8-bc9c-c2e3a09aea3c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.321765] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58e9d604-c96a-45ab-8bdf-f56ca7ce0bd9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.355519] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59b411e3-8cdb-46d5-88c6-b561adeddb57 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.368159] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0696a34-66ea-428e-8176-46cb04155d27 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.384592] env[67270]: DEBUG nova.compute.provider_tree [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1000.396283] env[67270]: DEBUG nova.scheduler.client.report [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1000.424254] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1000.424510] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1000.473024] env[67270]: DEBUG nova.compute.utils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1000.473024] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1000.473024] env[67270]: DEBUG nova.network.neutron [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1000.487443] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1000.570164] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1000.576117] env[67270]: DEBUG nova.policy [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd795e36ee0da4d0688dbe42aeb35886b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd69d9ce23208455a8cd937393f47a81a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1000.598720] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1000.598720] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1000.598720] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1000.598965] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1000.599141] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1000.599368] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1000.599521] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1000.599685] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1000.599933] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1000.600151] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1000.600332] env[67270]: DEBUG nova.virt.hardware [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1000.601762] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99f64428-e20f-4b22-9d31-bd8f61e67217 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.612017] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbe4dfea-bf4c-4ae4-a510-cfe9aadbb28c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.101284] env[67270]: DEBUG nova.network.neutron [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081
tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Successfully created port: 92ea4d92-5781-40be-91ee-673bfcbf4aeb {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1002.412049] env[67270]: DEBUG nova.compute.manager [req-0924a9df-7fe4-4bd9-ac26-eb3ad95b4a8e req-bc38e4d7-b4b8-406c-871a-c3afd6ab6621 service nova] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Received event network-vif-plugged-92ea4d92-5781-40be-91ee-673bfcbf4aeb {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1002.412315] env[67270]: DEBUG oslo_concurrency.lockutils [req-0924a9df-7fe4-4bd9-ac26-eb3ad95b4a8e req-bc38e4d7-b4b8-406c-871a-c3afd6ab6621 service nova] Acquiring lock "972c064e-2a9f-4afb-95b6-f6dd6b8a7a19-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1002.412973] env[67270]: DEBUG oslo_concurrency.lockutils [req-0924a9df-7fe4-4bd9-ac26-eb3ad95b4a8e req-bc38e4d7-b4b8-406c-871a-c3afd6ab6621 service nova] Lock "972c064e-2a9f-4afb-95b6-f6dd6b8a7a19-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1002.413224] env[67270]: DEBUG oslo_concurrency.lockutils [req-0924a9df-7fe4-4bd9-ac26-eb3ad95b4a8e req-bc38e4d7-b4b8-406c-871a-c3afd6ab6621 service nova] Lock "972c064e-2a9f-4afb-95b6-f6dd6b8a7a19-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1002.413416] env[67270]: DEBUG nova.compute.manager [req-0924a9df-7fe4-4bd9-ac26-eb3ad95b4a8e req-bc38e4d7-b4b8-406c-871a-c3afd6ab6621 service nova] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] No waiting events found dispatching network-vif-plugged-92ea4d92-5781-40be-91ee-673bfcbf4aeb {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1002.413785] env[67270]: WARNING nova.compute.manager [req-0924a9df-7fe4-4bd9-ac26-eb3ad95b4a8e req-bc38e4d7-b4b8-406c-871a-c3afd6ab6621 service nova] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Received unexpected event network-vif-plugged-92ea4d92-5781-40be-91ee-673bfcbf4aeb for instance with vm_state building and task_state spawning. 
[ 1002.638906] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquiring lock "39ead031-10c5-40e3-ba91-9d34334398f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1002.639239] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Lock "39ead031-10c5-40e3-ba91-9d34334398f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1002.651740] env[67270]: DEBUG nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1002.708256] env[67270]: DEBUG nova.network.neutron [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Successfully updated port: 92ea4d92-5781-40be-91ee-673bfcbf4aeb {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1002.724888] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1002.724888] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1002.725784] env[67270]: INFO nova.compute.claims [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1002.732395] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Acquiring lock "refresh_cache-972c064e-2a9f-4afb-95b6-f6dd6b8a7a19" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1002.732610] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Acquired lock "refresh_cache-972c064e-2a9f-4afb-95b6-f6dd6b8a7a19" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1002.733060] env[67270]: DEBUG 
nova.network.neutron [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1002.814720] env[67270]: DEBUG nova.network.neutron [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1002.894719] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72eaba5f-095d-495c-bb09-bd16233ff026 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.903453] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-244a0c8c-af9f-432d-9af9-6f4c4a6ebd46 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.940165] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f56910f-c09b-448d-9d1a-394871d7b5a9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.955093] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f75ad948-d322-4330-afb4-2775d788bc2d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.973159] env[67270]: DEBUG nova.compute.provider_tree [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1002.983567] env[67270]: DEBUG nova.scheduler.client.report [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1002.999565] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1003.000050] env[67270]: DEBUG nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Start 
building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1003.040118] env[67270]: DEBUG nova.compute.utils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1003.041032] env[67270]: DEBUG nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Not allocating networking since 'none' was specified. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 1003.052430] env[67270]: DEBUG nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1003.130843] env[67270]: DEBUG nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1003.157650] env[67270]: DEBUG nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1003.157911] env[67270]: DEBUG nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1003.158083] env[67270]: DEBUG nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1003.158315] env[67270]: DEBUG nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1003.158494] env[67270]: DEBUG 
nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1003.158669] env[67270]: DEBUG nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1003.158899] env[67270]: DEBUG nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1003.159088] env[67270]: DEBUG nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1003.159290] env[67270]: DEBUG nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1003.159464] env[67270]: DEBUG nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1003.159637] env[67270]: DEBUG nova.virt.hardware [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1003.160645] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c590fa31-b760-4e13-9d17-a35a09ad9df1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.170638] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d9e7481-ac96-4ad1-b72a-1aa1abfde5a1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.185814] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Instance VIF info [] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1003.193112] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Creating folder: Project (515f1801f63048288d9f10988695429e). Parent ref: group-v814248. 
{{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1003.195569] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-16ad592b-eed7-49e8-8657-dfb32085df02 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.209505] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Created folder: Project (515f1801f63048288d9f10988695429e) in parent group-v814248. [ 1003.209505] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Creating folder: Instances. Parent ref: group-v814306. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1003.209505] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ee119fff-95b7-47b6-8443-69e227c1ee79 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.222349] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Created folder: Instances in parent group-v814306. [ 1003.222621] env[67270]: DEBUG oslo.service.loopingcall [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1003.222822] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1003.223061] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2ca604a7-76da-4a64-b530-5b3d1aa25f67 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.241341] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1003.241341] env[67270]: value = "task-4110651" [ 1003.241341] env[67270]: _type = "Task" [ 1003.241341] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1003.251582] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110651, 'name': CreateVM_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1003.615390] env[67270]: DEBUG nova.network.neutron [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Updating instance_info_cache with network_info: [{"id": "92ea4d92-5781-40be-91ee-673bfcbf4aeb", "address": "fa:16:3e:09:00:d8", "network": {"id": "d2b8e6b2-0ab0-4913-88e7-0cf9c6fd8c4b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1942442557-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d69d9ce23208455a8cd937393f47a81a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39a4aca0-934b-4a91-8779-6a4360c3f967", "external-id": "nsx-vlan-transportzone-454", "segmentation_id": 454, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92ea4d92-57", "ovs_interfaceid": "92ea4d92-5781-40be-91ee-673bfcbf4aeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1003.630613] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Releasing lock "refresh_cache-972c064e-2a9f-4afb-95b6-f6dd6b8a7a19" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1003.630917] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Instance network_info: |[{"id": "92ea4d92-5781-40be-91ee-673bfcbf4aeb", "address": "fa:16:3e:09:00:d8", "network": {"id": "d2b8e6b2-0ab0-4913-88e7-0cf9c6fd8c4b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1942442557-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d69d9ce23208455a8cd937393f47a81a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39a4aca0-934b-4a91-8779-6a4360c3f967", "external-id": "nsx-vlan-transportzone-454", "segmentation_id": 454, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92ea4d92-57", "ovs_interfaceid": "92ea4d92-5781-40be-91ee-673bfcbf4aeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1003.631609] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:09:00:d8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '39a4aca0-934b-4a91-8779-6a4360c3f967', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '92ea4d92-5781-40be-91ee-673bfcbf4aeb', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1003.639779] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Creating folder: Project (d69d9ce23208455a8cd937393f47a81a). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1003.640479] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8ad92558-572c-40f1-aa02-ce0865d8da0b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.654864] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Created folder: Project (d69d9ce23208455a8cd937393f47a81a) in parent group-v814248. [ 1003.655074] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Creating folder: Instances. Parent ref: group-v814309. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1003.655309] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3e1bb5d0-33e1-45e7-812a-8bcea60c918e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.665735] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Created folder: Instances in parent group-v814309. [ 1003.666009] env[67270]: DEBUG oslo.service.loopingcall [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1003.666212] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1003.666424] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dc972a2a-2c7d-4c24-bce0-ba5797aacfac {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.688244] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1003.688244] env[67270]: value = "task-4110655" [ 1003.688244] env[67270]: _type = "Task" [ 1003.688244] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1003.696995] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110655, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1003.751956] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110651, 'name': CreateVM_Task, 'duration_secs': 0.374111} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1003.751956] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1003.753138] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1003.753138] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1003.753138] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1003.753138] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-53793e71-2f40-4c84-b71a-c5fae0ebd0bc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.758754] env[67270]: DEBUG oslo_vmware.api [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Waiting for the task: (returnval){ [ 1003.758754] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]5241a628-79d7-297f-6fc9-ae8588c452bd" [ 1003.758754] env[67270]: _type = "Task" [ 1003.758754] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1003.770615] env[67270]: DEBUG oslo_vmware.api [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]5241a628-79d7-297f-6fc9-ae8588c452bd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1004.199959] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110655, 'name': CreateVM_Task} progress is 25%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1004.268727] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1004.269047] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1004.269227] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1004.443411] env[67270]: DEBUG nova.compute.manager [req-eded298c-e417-42ce-bd8e-60de14905f9d req-0e6eae38-3061-4e8d-8c7b-1c013be35ff5 service nova] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Received event network-changed-92ea4d92-5781-40be-91ee-673bfcbf4aeb {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1004.444029] env[67270]: DEBUG nova.compute.manager [req-eded298c-e417-42ce-bd8e-60de14905f9d req-0e6eae38-3061-4e8d-8c7b-1c013be35ff5 service nova] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Refreshing instance network info cache due to event network-changed-92ea4d92-5781-40be-91ee-673bfcbf4aeb. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1004.444029] env[67270]: DEBUG oslo_concurrency.lockutils [req-eded298c-e417-42ce-bd8e-60de14905f9d req-0e6eae38-3061-4e8d-8c7b-1c013be35ff5 service nova] Acquiring lock "refresh_cache-972c064e-2a9f-4afb-95b6-f6dd6b8a7a19" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1004.444362] env[67270]: DEBUG oslo_concurrency.lockutils [req-eded298c-e417-42ce-bd8e-60de14905f9d req-0e6eae38-3061-4e8d-8c7b-1c013be35ff5 service nova] Acquired lock "refresh_cache-972c064e-2a9f-4afb-95b6-f6dd6b8a7a19" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1004.444609] env[67270]: DEBUG nova.network.neutron [req-eded298c-e417-42ce-bd8e-60de14905f9d req-0e6eae38-3061-4e8d-8c7b-1c013be35ff5 service nova] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Refreshing network info cache for port 92ea4d92-5781-40be-91ee-673bfcbf4aeb {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1004.700139] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110655, 'name': CreateVM_Task} progress is 25%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1004.704809] env[67270]: DEBUG nova.network.neutron [req-eded298c-e417-42ce-bd8e-60de14905f9d req-0e6eae38-3061-4e8d-8c7b-1c013be35ff5 service nova] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Updated VIF entry in instance network info cache for port 92ea4d92-5781-40be-91ee-673bfcbf4aeb. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1004.705191] env[67270]: DEBUG nova.network.neutron [req-eded298c-e417-42ce-bd8e-60de14905f9d req-0e6eae38-3061-4e8d-8c7b-1c013be35ff5 service nova] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Updating instance_info_cache with network_info: [{"id": "92ea4d92-5781-40be-91ee-673bfcbf4aeb", "address": "fa:16:3e:09:00:d8", "network": {"id": "d2b8e6b2-0ab0-4913-88e7-0cf9c6fd8c4b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1942442557-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d69d9ce23208455a8cd937393f47a81a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "39a4aca0-934b-4a91-8779-6a4360c3f967", "external-id": "nsx-vlan-transportzone-454", "segmentation_id": 454, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap92ea4d92-57", "ovs_interfaceid": "92ea4d92-5781-40be-91ee-673bfcbf4aeb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1004.714297] env[67270]: DEBUG oslo_concurrency.lockutils [req-eded298c-e417-42ce-bd8e-60de14905f9d req-0e6eae38-3061-4e8d-8c7b-1c013be35ff5 service nova] Releasing lock "refresh_cache-972c064e-2a9f-4afb-95b6-f6dd6b8a7a19" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1005.201314] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110655, 'name': CreateVM_Task} progress is 25%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1005.701490] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110655, 'name': CreateVM_Task} progress is 25%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1006.201906] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110655, 'name': CreateVM_Task} progress is 25%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1006.703436] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110655, 'name': CreateVM_Task} progress is 25%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1007.202893] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110655, 'name': CreateVM_Task} progress is 25%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1007.705214] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110655, 'name': CreateVM_Task} progress is 25%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1008.205202] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110655, 'name': CreateVM_Task, 'duration_secs': 4.207058} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1008.205375] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1008.206059] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1008.206221] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1008.206531] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1008.206790] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6096037b-51c1-45b3-9bc3-9bb58668591e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.211817] env[67270]: DEBUG oslo_vmware.api [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Waiting for the task: (returnval){ [ 1008.211817] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52663cbb-968a-830e-e3a1-9cb97f39cd30" [ 1008.211817] env[67270]: _type = "Task" [ 1008.211817] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1008.219993] env[67270]: DEBUG oslo_vmware.api [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52663cbb-968a-830e-e3a1-9cb97f39cd30, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1008.723887] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1008.724273] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1008.724387] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1019.473982] env[67270]: DEBUG oslo_concurrency.lockutils [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Acquiring lock "3273613a-db47-4af9-b3a5-d0dedffd3332" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1019.474365] env[67270]: DEBUG oslo_concurrency.lockutils [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Lock "3273613a-db47-4af9-b3a5-d0dedffd3332" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1031.438142] env[67270]: WARNING oslo_vmware.rw_handles [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote 
end closed connection without" [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1031.438142] env[67270]: ERROR oslo_vmware.rw_handles [ 1031.438753] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/0f9da5b7-3194-4647-9bbd-4524f622fe59/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1031.441517] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1031.441800] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Copying Virtual Disk [datastore1] vmware_temp/0f9da5b7-3194-4647-9bbd-4524f622fe59/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/0f9da5b7-3194-4647-9bbd-4524f622fe59/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1031.442120] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2793f8fa-8967-4860-8038-e78b4474f55c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.454522] env[67270]: DEBUG oslo_vmware.api [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Waiting for the task: (returnval){ [ 1031.454522] env[67270]: value = "task-4110662" [ 1031.454522] env[67270]: _type = "Task" [ 1031.454522] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.463118] env[67270]: DEBUG oslo_vmware.api [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Task: {'id': task-4110662, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1031.965878] env[67270]: DEBUG oslo_vmware.exceptions [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Fault InvalidArgument not matched. 
{{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1031.966191] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1031.966763] env[67270]: ERROR nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1031.966763] env[67270]: Faults: ['InvalidArgument'] [ 1031.966763] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Traceback (most recent call last): [ 1031.966763] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1031.966763] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] yield resources [ 1031.966763] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1031.966763] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] self.driver.spawn(context, instance, image_meta, [ 1031.966763] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1031.966763] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1031.966763] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1031.966763] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] self._fetch_image_if_missing(context, vi) [ 1031.966763] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] image_cache(vi, tmp_image_ds_loc) [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] vm_util.copy_virtual_disk( [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] session._wait_for_task(vmdk_copy_task) [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] return self.wait_for_task(task_ref) [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] return evt.wait() [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] result = hub.switch() [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1031.967150] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] return self.greenlet.switch() [ 1031.967511] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1031.967511] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] self.f(*self.args, **self.kw) [ 1031.967511] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1031.967511] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] raise exceptions.translate_fault(task_info.error) [ 1031.967511] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1031.967511] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Faults: ['InvalidArgument'] [ 1031.967511] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] [ 1031.967511] env[67270]: INFO nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Terminating instance [ 1031.968718] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1031.969730] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1031.970365] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Start destroying the 
instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1031.970556] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1031.970827] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-45c78fbf-2ae5-4247-aa7b-b2ddc96869ad {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.973523] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-934437c0-d4c3-4fe9-838e-98427dab7784 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.981283] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1031.981523] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-39c892d8-58e8-4f43-b950-3d4572a84e3e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.984087] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1031.984264] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1031.985293] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-04598cd7-f7b8-42f8-8a07-04b31b82b8ab {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.990796] env[67270]: DEBUG oslo_vmware.api [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Waiting for the task: (returnval){ [ 1031.990796] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52a50375-2f50-a716-ed15-72e9c53f433d" [ 1031.990796] env[67270]: _type = "Task" [ 1031.990796] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.998862] env[67270]: DEBUG oslo_vmware.api [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52a50375-2f50-a716-ed15-72e9c53f433d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1032.061367] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1032.061748] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1032.061961] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Deleting the datastore file [datastore1] 379f5a6d-d6d4-434a-b401-1b027434e6fd {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1032.062291] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7201c362-1fb6-4afd-9cbb-72693ecc0e49 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.069240] env[67270]: DEBUG oslo_vmware.api [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Waiting for the task: (returnval){ [ 1032.069240] env[67270]: value = "task-4110664" [ 1032.069240] env[67270]: _type = "Task" [ 1032.069240] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1032.077560] env[67270]: DEBUG oslo_vmware.api [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Task: {'id': task-4110664, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1032.501779] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1032.502133] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Creating directory with path [datastore1] vmware_temp/a5c93f1a-5851-4c04-a0cf-e31f3e39e7c9/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1032.502323] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-91ef1778-8b97-4d1b-9663-a874eba40c99 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.515180] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Created directory with path [datastore1] vmware_temp/a5c93f1a-5851-4c04-a0cf-e31f3e39e7c9/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1032.515317] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Fetch image to [datastore1] vmware_temp/a5c93f1a-5851-4c04-a0cf-e31f3e39e7c9/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1032.515490] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/a5c93f1a-5851-4c04-a0cf-e31f3e39e7c9/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1032.516295] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8b7ffb5-9189-474c-9258-266c62b7771b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.523932] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d81de54c-0341-47c8-9f6b-0c7e4d8ccd1e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.534998] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41a37dc5-1542-4975-af51-14075ed1f268 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.587689] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0548c2bd-3acf-4894-bc62-bf732670298e {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.597150] env[67270]: DEBUG oslo_vmware.api [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Task: {'id': task-4110664, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069508} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1032.598843] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1032.599053] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1032.599232] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1032.599404] env[67270]: INFO nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 1032.601709] env[67270]: DEBUG nova.compute.claims [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1032.601886] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1032.602141] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1032.605312] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-75cc0926-752a-40ca-86da-a45923d3e5b8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.630689] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1032.681540] env[67270]: DEBUG oslo_vmware.rw_handles [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a5c93f1a-5851-4c04-a0cf-e31f3e39e7c9/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1032.744678] env[67270]: DEBUG oslo_vmware.rw_handles [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1032.744866] env[67270]: DEBUG oslo_vmware.rw_handles [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a5c93f1a-5851-4c04-a0cf-e31f3e39e7c9/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1032.800463] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-966622aa-5a2f-4345-ad75-7fee2143c362 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.808738] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce70f759-d38e-4923-adea-5510152ea35b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.839806] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fb051cd-fbf0-4732-b542-f652abc761df {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.847893] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-239bb5e8-afdf-4489-954e-27244f7e4161 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.861943] env[67270]: DEBUG nova.compute.provider_tree [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1032.871284] env[67270]: DEBUG nova.scheduler.client.report [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1032.884722] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.282s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1032.885276] env[67270]: ERROR nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1032.885276] env[67270]: Faults: ['InvalidArgument'] [ 1032.885276] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Traceback (most recent call last): [ 1032.885276] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1032.885276] env[67270]: ERROR nova.compute.manager [instance: 
379f5a6d-d6d4-434a-b401-1b027434e6fd] self.driver.spawn(context, instance, image_meta, [ 1032.885276] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1032.885276] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1032.885276] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1032.885276] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] self._fetch_image_if_missing(context, vi) [ 1032.885276] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1032.885276] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] image_cache(vi, tmp_image_ds_loc) [ 1032.885276] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] vm_util.copy_virtual_disk( [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] session._wait_for_task(vmdk_copy_task) [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] return self.wait_for_task(task_ref) [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] return evt.wait() [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] result = hub.switch() [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] return self.greenlet.switch() [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1032.885600] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] self.f(*self.args, **self.kw) [ 1032.885914] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 
1032.885914] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] raise exceptions.translate_fault(task_info.error) [ 1032.885914] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1032.885914] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Faults: ['InvalidArgument'] [ 1032.885914] env[67270]: ERROR nova.compute.manager [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] [ 1032.886047] env[67270]: DEBUG nova.compute.utils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1032.887727] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Build of instance 379f5a6d-d6d4-434a-b401-1b027434e6fd was re-scheduled: A specified parameter was not correct: fileType [ 1032.887727] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1032.888124] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1032.888289] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1032.888439] env[67270]: DEBUG nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1032.888597] env[67270]: DEBUG nova.network.neutron [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1033.150958] env[67270]: DEBUG nova.network.neutron [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1033.162992] env[67270]: INFO nova.compute.manager [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Took 0.27 seconds to deallocate network for instance. [ 1033.255044] env[67270]: INFO nova.scheduler.client.report [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Deleted allocations for instance 379f5a6d-d6d4-434a-b401-1b027434e6fd [ 1033.274717] env[67270]: DEBUG oslo_concurrency.lockutils [None req-53aced9d-b031-4878-a703-8abc09bbd836 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Lock "379f5a6d-d6d4-434a-b401-1b027434e6fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 348.759s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1033.275981] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Lock "379f5a6d-d6d4-434a-b401-1b027434e6fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 151.255s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1033.276212] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Acquiring lock "379f5a6d-d6d4-434a-b401-1b027434e6fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1033.276416] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Lock "379f5a6d-d6d4-434a-b401-1b027434e6fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1033.276818] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Lock "379f5a6d-d6d4-434a-b401-1b027434e6fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1033.279116] env[67270]: INFO nova.compute.manager [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Terminating instance [ 1033.281374] env[67270]: DEBUG nova.compute.manager [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1033.281374] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1033.282080] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-44b8442a-68ef-4840-841b-37fbe952e45b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.292874] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cb9445c-b02e-4012-a094-abaacb025dcb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.304234] env[67270]: DEBUG nova.compute.manager [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1033.327171] env[67270]: WARNING nova.virt.vmwareapi.vmops [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 379f5a6d-d6d4-434a-b401-1b027434e6fd could not be found. [ 1033.327403] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1033.327584] env[67270]: INFO nova.compute.manager [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 1033.327829] env[67270]: DEBUG oslo.service.loopingcall [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1033.328072] env[67270]: DEBUG nova.compute.manager [-] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1033.328173] env[67270]: DEBUG nova.network.neutron [-] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1033.352872] env[67270]: DEBUG oslo_concurrency.lockutils [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1033.352978] env[67270]: DEBUG oslo_concurrency.lockutils [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1033.354583] env[67270]: INFO nova.compute.claims [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1033.358023] env[67270]: DEBUG nova.network.neutron [-] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1033.366171] env[67270]: INFO nova.compute.manager [-] [instance: 379f5a6d-d6d4-434a-b401-1b027434e6fd] Took 0.04 seconds to deallocate network for instance. 
[ 1033.460421] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3bdc04c3-34a9-4fed-8eb4-ab6c849e1895 tempest-ServerExternalEventsTest-787742656 tempest-ServerExternalEventsTest-787742656-project-member] Lock "379f5a6d-d6d4-434a-b401-1b027434e6fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.184s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1033.501577] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24e98006-3c67-4e67-a0a6-d118e1065a00 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.510103] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-140ffb00-7aef-41c7-baea-d6aa113c6176 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.541160] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-856a3c5e-c2e9-45ba-b009-435c836043bf {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.549250] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76703f5c-adb4-450e-abca-680dd259314b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.564099] env[67270]: DEBUG nova.compute.provider_tree [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1033.572594] env[67270]: DEBUG nova.scheduler.client.report [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1033.587121] env[67270]: DEBUG oslo_concurrency.lockutils [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1033.587516] env[67270]: DEBUG nova.compute.manager [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Start building networks asynchronously for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1033.620142] env[67270]: DEBUG nova.compute.utils [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1033.621715] env[67270]: DEBUG nova.compute.manager [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1033.621908] env[67270]: DEBUG nova.network.neutron [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1033.630147] env[67270]: DEBUG nova.compute.manager [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1033.660381] env[67270]: INFO nova.virt.block_device [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Booting with volume 45a2521f-732e-4ca8-a8fe-33552aab49d8 at /dev/sda [ 1033.691885] env[67270]: DEBUG nova.policy [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f8a22c431abe4634a5436a796c91a18d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c34292ba9696423dbe2409fb50e939a5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 1033.695045] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c6ddc0fb-e557-4337-b835-dc63565a3cff {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.704558] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b919cbc9-6f6d-419d-8da8-5b507f3df392 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.733712] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-da7bda4e-c8c9-4449-a534-305e08f89d58 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.741560] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bf2e264-e4b7-45ab-9428-13d9bc558d7e {{(pid=67270) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.770767] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fdf5d8b-17c8-48d7-80da-63d11ca38933 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.777526] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd8095b7-68f8-448a-8c39-d39115cb549f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.790554] env[67270]: DEBUG nova.virt.block_device [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Updating existing volume attachment record: 99586685-bb95-46ed-b6e1-d0d80983ff89 {{(pid=67270) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1034.022312] env[67270]: DEBUG nova.compute.manager [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1034.022932] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1034.023180] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1034.023340] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1034.023521] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1034.023663] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 
1034.023852] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1034.024089] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1034.024248] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1034.024410] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1034.024569] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1034.024735] env[67270]: DEBUG nova.virt.hardware [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1034.025696] env[67270]: DEBUG nova.network.neutron [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Successfully created port: 4541d405-fa82-485f-83dc-66275107feed {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1034.028373] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adea2cd5-62e0-475c-9338-f1809ee14957 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1034.037943] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c64b709-5261-4c15-9d3d-e13636908300 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1034.930994] env[67270]: DEBUG nova.network.neutron [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Successfully updated port: 4541d405-fa82-485f-83dc-66275107feed {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1034.947920] env[67270]: DEBUG 
oslo_concurrency.lockutils [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Acquiring lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1034.947920] env[67270]: DEBUG oslo_concurrency.lockutils [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Acquired lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1034.947920] env[67270]: DEBUG nova.network.neutron [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1034.985521] env[67270]: DEBUG nova.network.neutron [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1035.184223] env[67270]: DEBUG nova.compute.manager [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Received event network-vif-plugged-4541d405-fa82-485f-83dc-66275107feed {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1035.184454] env[67270]: DEBUG oslo_concurrency.lockutils [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] Acquiring lock "3273613a-db47-4af9-b3a5-d0dedffd3332-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1035.185475] env[67270]: DEBUG oslo_concurrency.lockutils [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] Lock "3273613a-db47-4af9-b3a5-d0dedffd3332-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1035.185707] env[67270]: DEBUG oslo_concurrency.lockutils [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] Lock "3273613a-db47-4af9-b3a5-d0dedffd3332-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1035.185905] env[67270]: DEBUG nova.compute.manager [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] No waiting events found dispatching network-vif-plugged-4541d405-fa82-485f-83dc-66275107feed {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1035.186092] env[67270]: WARNING nova.compute.manager [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] 
[instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Received unexpected event network-vif-plugged-4541d405-fa82-485f-83dc-66275107feed for instance with vm_state building and task_state spawning. [ 1035.186262] env[67270]: DEBUG nova.compute.manager [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Received event network-changed-4541d405-fa82-485f-83dc-66275107feed {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1035.186420] env[67270]: DEBUG nova.compute.manager [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Refreshing instance network info cache due to event network-changed-4541d405-fa82-485f-83dc-66275107feed. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1035.186595] env[67270]: DEBUG oslo_concurrency.lockutils [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] Acquiring lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1035.200220] env[67270]: DEBUG nova.network.neutron [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Updating instance_info_cache with network_info: [{"id": "4541d405-fa82-485f-83dc-66275107feed", "address": "fa:16:3e:89:95:65", "network": {"id": "a4769043-fe42-412e-ab9c-b263c394496f", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1260212331-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c34292ba9696423dbe2409fb50e939a5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "54c45719-5690-47bf-b45b-6cad9813071e", "external-id": "nsx-vlan-transportzone-62", "segmentation_id": 62, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4541d405-fa", "ovs_interfaceid": "4541d405-fa82-485f-83dc-66275107feed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1035.219359] env[67270]: DEBUG oslo_concurrency.lockutils [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Releasing lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1035.219359] env[67270]: DEBUG nova.compute.manager [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Instance network_info: |[{"id": "4541d405-fa82-485f-83dc-66275107feed", 
"address": "fa:16:3e:89:95:65", "network": {"id": "a4769043-fe42-412e-ab9c-b263c394496f", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1260212331-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c34292ba9696423dbe2409fb50e939a5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "54c45719-5690-47bf-b45b-6cad9813071e", "external-id": "nsx-vlan-transportzone-62", "segmentation_id": 62, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4541d405-fa", "ovs_interfaceid": "4541d405-fa82-485f-83dc-66275107feed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1035.219626] env[67270]: DEBUG oslo_concurrency.lockutils [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] Acquired lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1035.219626] env[67270]: DEBUG nova.network.neutron [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Refreshing network info cache for port 4541d405-fa82-485f-83dc-66275107feed {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1035.219626] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:89:95:65', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '54c45719-5690-47bf-b45b-6cad9813071e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4541d405-fa82-485f-83dc-66275107feed', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1035.229053] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Creating folder: Project (c34292ba9696423dbe2409fb50e939a5). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1035.234022] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-056d6551-9993-471b-b716-dd8a807c741b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.248234] env[67270]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. 
[ 1035.248394] env[67270]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=67270) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 1035.248722] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Folder already exists: Project (c34292ba9696423dbe2409fb50e939a5). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1035.248916] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Creating folder: Instances. Parent ref: group-v814303. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1035.249174] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8072887c-85e7-4496-afe9-be63ee724bab {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.259623] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Created folder: Instances in parent group-v814303. [ 1035.259873] env[67270]: DEBUG oslo.service.loopingcall [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1035.260078] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1035.260281] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b9a52ae0-fc89-4cf1-a59e-bb079b6fb82f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.286768] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1035.286768] env[67270]: value = "task-4110667" [ 1035.286768] env[67270]: _type = "Task" [ 1035.286768] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1035.295282] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110667, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1035.579057] env[67270]: DEBUG nova.network.neutron [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Updated VIF entry in instance network info cache for port 4541d405-fa82-485f-83dc-66275107feed. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1035.579533] env[67270]: DEBUG nova.network.neutron [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Updating instance_info_cache with network_info: [{"id": "4541d405-fa82-485f-83dc-66275107feed", "address": "fa:16:3e:89:95:65", "network": {"id": "a4769043-fe42-412e-ab9c-b263c394496f", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1260212331-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c34292ba9696423dbe2409fb50e939a5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "54c45719-5690-47bf-b45b-6cad9813071e", "external-id": "nsx-vlan-transportzone-62", "segmentation_id": 62, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4541d405-fa", "ovs_interfaceid": "4541d405-fa82-485f-83dc-66275107feed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1035.619982] env[67270]: DEBUG oslo_concurrency.lockutils [req-4febc03c-a5e7-43a9-b9e7-08e13e01884a req-71e5140d-cf53-4f5a-aa28-37ebd2276b92 service nova] Releasing lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1035.799115] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110667, 'name': CreateVM_Task, 'duration_secs': 0.311295} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1035.799308] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1035.800014] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': True, 'guest_format': None, 'boot_index': 0, 'disk_bus': None, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-814312', 'volume_id': '45a2521f-732e-4ca8-a8fe-33552aab49d8', 'name': 'volume-45a2521f-732e-4ca8-a8fe-33552aab49d8', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '3273613a-db47-4af9-b3a5-d0dedffd3332', 'attached_at': '', 'detached_at': '', 'volume_id': '45a2521f-732e-4ca8-a8fe-33552aab49d8', 'serial': '45a2521f-732e-4ca8-a8fe-33552aab49d8'}, 'attachment_id': '99586685-bb95-46ed-b6e1-d0d80983ff89', 'device_type': None, 'mount_device': '/dev/sda', 'volume_type': None}], 'swap': None} {{(pid=67270) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 1035.800238] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Root volume attach. Driver type: vmdk {{(pid=67270) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 1035.801023] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-609e8fe8-3661-40d8-b334-9c22966d823a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.809719] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b980dc9d-d317-46d5-b057-0b74699511b9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.816803] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bddbff5-f894-484d-8bfd-417dfb7a2435 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.823212] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-8c2ebf6f-cd20-44f8-b72b-f06d3ed2b5d0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1035.831277] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for the task: (returnval){ [ 1035.831277] env[67270]: value = "task-4110668" [ 1035.831277] env[67270]: _type = "Task" [ 1035.831277] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1035.839769] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110668, 'name': RelocateVM_Task} progress is 5%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1036.341544] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110668, 'name': RelocateVM_Task, 'duration_secs': 0.359944} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1036.341968] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Volume attach. Driver type: vmdk {{(pid=67270) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 1036.342203] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-814312', 'volume_id': '45a2521f-732e-4ca8-a8fe-33552aab49d8', 'name': 'volume-45a2521f-732e-4ca8-a8fe-33552aab49d8', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '3273613a-db47-4af9-b3a5-d0dedffd3332', 'attached_at': '', 'detached_at': '', 'volume_id': '45a2521f-732e-4ca8-a8fe-33552aab49d8', 'serial': '45a2521f-732e-4ca8-a8fe-33552aab49d8'} {{(pid=67270) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 1036.343061] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc3cfd09-a903-4151-8c63-d5b00e2d3a79 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1036.361665] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c87e427-08a8-4d60-819f-da32696f47c8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1036.387298] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Reconfiguring VM instance instance-00000026 to attach disk [datastore1] volume-45a2521f-732e-4ca8-a8fe-33552aab49d8/volume-45a2521f-732e-4ca8-a8fe-33552aab49d8.vmdk or device None with type thin {{(pid=67270) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 1036.387731] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-5b4367b6-755d-4255-b20b-a601a92fcd6c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1036.409602] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 
tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for the task: (returnval){ [ 1036.409602] env[67270]: value = "task-4110669" [ 1036.409602] env[67270]: _type = "Task" [ 1036.409602] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1036.419261] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110669, 'name': ReconfigVM_Task} progress is 6%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1036.920945] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110669, 'name': ReconfigVM_Task, 'duration_secs': 0.269999} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1036.920945] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Reconfigured VM instance instance-00000026 to attach disk [datastore1] volume-45a2521f-732e-4ca8-a8fe-33552aab49d8/volume-45a2521f-732e-4ca8-a8fe-33552aab49d8.vmdk or device None with type thin {{(pid=67270) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 1036.925627] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-aee6bc4c-131e-427c-9656-1e117c46780c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1036.942464] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for the task: (returnval){ [ 1036.942464] env[67270]: value = "task-4110670" [ 1036.942464] env[67270]: _type = "Task" [ 1036.942464] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1036.955127] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110670, 'name': ReconfigVM_Task} progress is 5%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1037.455893] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110670, 'name': ReconfigVM_Task, 'duration_secs': 0.123465} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1037.456746] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-814312', 'volume_id': '45a2521f-732e-4ca8-a8fe-33552aab49d8', 'name': 'volume-45a2521f-732e-4ca8-a8fe-33552aab49d8', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '3273613a-db47-4af9-b3a5-d0dedffd3332', 'attached_at': '', 'detached_at': '', 'volume_id': '45a2521f-732e-4ca8-a8fe-33552aab49d8', 'serial': '45a2521f-732e-4ca8-a8fe-33552aab49d8'} {{(pid=67270) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 1037.457455] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-bdbb6284-fa03-4625-afc3-d83f3fec512a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1037.465515] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for the task: (returnval){ [ 1037.465515] env[67270]: value = "task-4110671" [ 1037.465515] env[67270]: _type = "Task" [ 1037.465515] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1037.475625] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110671, 'name': Rename_Task} progress is 5%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1037.978808] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110671, 'name': Rename_Task, 'duration_secs': 0.132273} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1037.981748] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Powering on the VM {{(pid=67270) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 1037.981748] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-398af512-c1f1-4e40-9829-2a86e8bf1c3b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1037.990019] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for the task: (returnval){ [ 1037.990019] env[67270]: value = "task-4110672" [ 1037.990019] env[67270]: _type = "Task" [ 1037.990019] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1037.998371] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110672, 'name': PowerOnVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1038.499190] env[67270]: DEBUG oslo_vmware.api [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110672, 'name': PowerOnVM_Task, 'duration_secs': 0.490602} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1038.499574] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Powered on the VM {{(pid=67270) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 1038.499671] env[67270]: INFO nova.compute.manager [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Took 4.48 seconds to spawn the instance on the hypervisor. [ 1038.499850] env[67270]: DEBUG nova.compute.manager [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Checking state {{(pid=67270) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 1038.500683] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9221bde5-71b8-47ed-83bd-a81f2f725e63 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1038.560310] env[67270]: INFO nova.compute.manager [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Took 5.22 seconds to build instance. 
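Every vCenter call in this run (CreateVM_Task, RelocateVM_Task, ReconfigVM_Task, Rename_Task, PowerOnVM_Task) follows the same shape in the records above: invoke the task, then poll it through wait_for_task/_poll_task, logging "progress is N%" until the task completes with a duration_secs. A minimal sketch of that poll loop, assuming a hypothetical get_task_info() accessor and a fixed retry interval rather than oslo.vmware's actual implementation:

```python
import time

# Sketch of the poll-until-done pattern behind the wait_for_task /
# _poll_task records above. get_task_info(), the state strings, and the
# 0.5s interval are illustrative assumptions, not oslo.vmware internals.
def wait_for_task(session, task_ref, interval: float = 0.5):
    while True:
        info = session.get_task_info(task_ref)
        if info.state == "success":
            return info.result    # e.g. the created VM's managed object ref
        if info.state == "error":
            raise RuntimeError(info.error)
        # "queued" / "running": emit a progress line and retry, as logged above
        time.sleep(interval)
```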
[ 1038.570850] env[67270]: DEBUG oslo_concurrency.lockutils [None req-56e807b4-421b-4c36-9557-872cef67d666 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Lock "3273613a-db47-4af9-b3a5-d0dedffd3332" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.096s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1039.527385] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Acquiring lock "65509bc1-a140-416a-a465-4c9e6efce4a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1039.527771] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Lock "65509bc1-a140-416a-a465-4c9e6efce4a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1039.539752] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1039.595288] env[67270]: DEBUG nova.compute.manager [req-ff06d6f1-1d21-4357-ad49-40cad6c06cd7 req-8ff2d147-7e03-4041-aab9-b92a0a8f09d3 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Received event network-changed-4541d405-fa82-485f-83dc-66275107feed {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1039.595481] env[67270]: DEBUG nova.compute.manager [req-ff06d6f1-1d21-4357-ad49-40cad6c06cd7 req-8ff2d147-7e03-4041-aab9-b92a0a8f09d3 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Refreshing instance network info cache due to event network-changed-4541d405-fa82-485f-83dc-66275107feed. 
{{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1039.595708] env[67270]: DEBUG oslo_concurrency.lockutils [req-ff06d6f1-1d21-4357-ad49-40cad6c06cd7 req-8ff2d147-7e03-4041-aab9-b92a0a8f09d3 service nova] Acquiring lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1039.595854] env[67270]: DEBUG oslo_concurrency.lockutils [req-ff06d6f1-1d21-4357-ad49-40cad6c06cd7 req-8ff2d147-7e03-4041-aab9-b92a0a8f09d3 service nova] Acquired lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1039.597358] env[67270]: DEBUG nova.network.neutron [req-ff06d6f1-1d21-4357-ad49-40cad6c06cd7 req-8ff2d147-7e03-4041-aab9-b92a0a8f09d3 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Refreshing network info cache for port 4541d405-fa82-485f-83dc-66275107feed {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1039.605509] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1039.605767] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1039.608264] env[67270]: INFO nova.compute.claims [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1039.786701] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b80e947b-6fb6-47cf-8b74-21d6f1f8efbc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1039.795873] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61023232-0d51-4418-a8c7-855271c5b02b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1039.835178] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0de93942-f20f-4064-b30b-a5bab5d6ccd4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1039.844945] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c6afc02-28f5-4789-9eb2-1a2d6fb2cc5d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1039.861470] env[67270]: DEBUG nova.compute.provider_tree [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1039.873021] env[67270]: DEBUG nova.scheduler.client.report [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1039.888163] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1039.888960] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1039.927303] env[67270]: DEBUG nova.compute.utils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1039.934301] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1039.934493] env[67270]: DEBUG nova.network.neutron [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1039.947214] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Start building block device mappings for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1040.016899] env[67270]: DEBUG nova.policy [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ee60467f52e84dfabea7bb372d65141e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '74eeac5d45074a65b8b53ec94a30ef0c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 1040.033662] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1040.062934] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1040.063207] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1040.063364] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1040.063545] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1040.063687] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1040.063896] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 
tempest-ServersTestJSON-1279423063-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1040.064266] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1040.064448] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1040.064620] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1040.065545] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1040.065742] env[67270]: DEBUG nova.virt.hardware [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1040.066628] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e988361-66ae-48fb-9760-677678011b82 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1040.076648] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ada935d4-93e8-459f-ba8a-c506b571b5a3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1040.191903] env[67270]: DEBUG nova.network.neutron [req-ff06d6f1-1d21-4357-ad49-40cad6c06cd7 req-8ff2d147-7e03-4041-aab9-b92a0a8f09d3 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Updated VIF entry in instance network info cache for port 4541d405-fa82-485f-83dc-66275107feed. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1040.192060] env[67270]: DEBUG nova.network.neutron [req-ff06d6f1-1d21-4357-ad49-40cad6c06cd7 req-8ff2d147-7e03-4041-aab9-b92a0a8f09d3 service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Updating instance_info_cache with network_info: [{"id": "4541d405-fa82-485f-83dc-66275107feed", "address": "fa:16:3e:89:95:65", "network": {"id": "a4769043-fe42-412e-ab9c-b263c394496f", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1260212331-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.129", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c34292ba9696423dbe2409fb50e939a5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "54c45719-5690-47bf-b45b-6cad9813071e", "external-id": "nsx-vlan-transportzone-62", "segmentation_id": 62, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4541d405-fa", "ovs_interfaceid": "4541d405-fa82-485f-83dc-66275107feed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1040.204964] env[67270]: DEBUG oslo_concurrency.lockutils [req-ff06d6f1-1d21-4357-ad49-40cad6c06cd7 req-8ff2d147-7e03-4041-aab9-b92a0a8f09d3 service nova] Releasing lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1040.425027] env[67270]: DEBUG nova.network.neutron [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Successfully created port: 0426c809-3240-4de0-80a0-edaebb3eb132 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1040.987629] env[67270]: DEBUG nova.compute.manager [req-ba3a283d-7634-4624-b145-f03fd8a209d0 req-4f70ce51-2ac5-4f71-ac50-7b00b240cd26 service nova] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Received event network-vif-plugged-0426c809-3240-4de0-80a0-edaebb3eb132 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1040.987943] env[67270]: DEBUG oslo_concurrency.lockutils [req-ba3a283d-7634-4624-b145-f03fd8a209d0 req-4f70ce51-2ac5-4f71-ac50-7b00b240cd26 service nova] Acquiring lock "65509bc1-a140-416a-a465-4c9e6efce4a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1040.988159] env[67270]: DEBUG oslo_concurrency.lockutils [req-ba3a283d-7634-4624-b145-f03fd8a209d0 req-4f70ce51-2ac5-4f71-ac50-7b00b240cd26 service nova] Lock "65509bc1-a140-416a-a465-4c9e6efce4a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1040.988372] 
env[67270]: DEBUG oslo_concurrency.lockutils [req-ba3a283d-7634-4624-b145-f03fd8a209d0 req-4f70ce51-2ac5-4f71-ac50-7b00b240cd26 service nova] Lock "65509bc1-a140-416a-a465-4c9e6efce4a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1040.988619] env[67270]: DEBUG nova.compute.manager [req-ba3a283d-7634-4624-b145-f03fd8a209d0 req-4f70ce51-2ac5-4f71-ac50-7b00b240cd26 service nova] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] No waiting events found dispatching network-vif-plugged-0426c809-3240-4de0-80a0-edaebb3eb132 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1040.988812] env[67270]: WARNING nova.compute.manager [req-ba3a283d-7634-4624-b145-f03fd8a209d0 req-4f70ce51-2ac5-4f71-ac50-7b00b240cd26 service nova] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Received unexpected event network-vif-plugged-0426c809-3240-4de0-80a0-edaebb3eb132 for instance with vm_state building and task_state spawning. [ 1041.045161] env[67270]: DEBUG nova.network.neutron [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Successfully updated port: 0426c809-3240-4de0-80a0-edaebb3eb132 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1041.057896] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Acquiring lock "refresh_cache-65509bc1-a140-416a-a465-4c9e6efce4a0" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1041.058084] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Acquired lock "refresh_cache-65509bc1-a140-416a-a465-4c9e6efce4a0" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1041.058260] env[67270]: DEBUG nova.network.neutron [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1041.101262] env[67270]: DEBUG nova.network.neutron [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Instance cache missing network info. 
{{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1041.316590] env[67270]: DEBUG nova.network.neutron [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Updating instance_info_cache with network_info: [{"id": "0426c809-3240-4de0-80a0-edaebb3eb132", "address": "fa:16:3e:5a:2b:a9", "network": {"id": "b1453c02-413a-4c8e-9a64-7dac3ec1cf97", "bridge": "br-int", "label": "tempest-ServersTestJSON-274866854-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "74eeac5d45074a65b8b53ec94a30ef0c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46e1fc20-2067-4e1a-9812-702772a2c82c", "external-id": "nsx-vlan-transportzone-210", "segmentation_id": 210, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0426c809-32", "ovs_interfaceid": "0426c809-3240-4de0-80a0-edaebb3eb132", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1041.327675] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Releasing lock "refresh_cache-65509bc1-a140-416a-a465-4c9e6efce4a0" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1041.327987] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Instance network_info: |[{"id": "0426c809-3240-4de0-80a0-edaebb3eb132", "address": "fa:16:3e:5a:2b:a9", "network": {"id": "b1453c02-413a-4c8e-9a64-7dac3ec1cf97", "bridge": "br-int", "label": "tempest-ServersTestJSON-274866854-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "74eeac5d45074a65b8b53ec94a30ef0c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46e1fc20-2067-4e1a-9812-702772a2c82c", "external-id": "nsx-vlan-transportzone-210", "segmentation_id": 210, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0426c809-32", "ovs_interfaceid": "0426c809-3240-4de0-80a0-edaebb3eb132", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1041.328606] env[67270]: DEBUG 
nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5a:2b:a9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '46e1fc20-2067-4e1a-9812-702772a2c82c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0426c809-3240-4de0-80a0-edaebb3eb132', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1041.336126] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Creating folder: Project (74eeac5d45074a65b8b53ec94a30ef0c). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1041.336634] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ee8a2343-b161-4aa3-8cb1-d95a4ef8a0e8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.349429] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Created folder: Project (74eeac5d45074a65b8b53ec94a30ef0c) in parent group-v814248. [ 1041.349795] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Creating folder: Instances. Parent ref: group-v814315. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1041.350374] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-de43ed9f-1433-4eed-ad34-7338407fcdae {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.363647] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Created folder: Instances in parent group-v814315. [ 1041.363915] env[67270]: DEBUG oslo.service.loopingcall [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1041.364128] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1041.364346] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-587bf5d2-268d-48da-8ee8-ab5121e8fb80 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.384826] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1041.384826] env[67270]: value = "task-4110675" [ 1041.384826] env[67270]: _type = "Task" [ 1041.384826] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1041.393812] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110675, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1041.896081] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110675, 'name': CreateVM_Task, 'duration_secs': 0.328246} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1041.896081] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1041.896081] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1041.896349] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1041.896562] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1041.896807] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-609eab3d-675e-4d5e-bd52-d54d1749295e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.902105] env[67270]: DEBUG oslo_vmware.api [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Waiting for the task: (returnval){ [ 1041.902105] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52d290b4-624b-9932-2785-87219f6e13d1" [ 1041.902105] env[67270]: _type = "Task" [ 1041.902105] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1041.912866] env[67270]: DEBUG oslo_vmware.api [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52d290b4-624b-9932-2785-87219f6e13d1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1042.412185] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1042.412622] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1042.412622] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1043.014455] env[67270]: DEBUG nova.compute.manager [req-33e63e0a-e8ca-44e8-914b-c434251522f4 req-92dff061-acab-4ee5-8c14-e0fd65bd7e20 service nova] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Received event network-changed-0426c809-3240-4de0-80a0-edaebb3eb132 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1043.014641] env[67270]: DEBUG nova.compute.manager [req-33e63e0a-e8ca-44e8-914b-c434251522f4 req-92dff061-acab-4ee5-8c14-e0fd65bd7e20 service nova] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Refreshing instance network info cache due to event network-changed-0426c809-3240-4de0-80a0-edaebb3eb132. {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1043.014852] env[67270]: DEBUG oslo_concurrency.lockutils [req-33e63e0a-e8ca-44e8-914b-c434251522f4 req-92dff061-acab-4ee5-8c14-e0fd65bd7e20 service nova] Acquiring lock "refresh_cache-65509bc1-a140-416a-a465-4c9e6efce4a0" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1043.014995] env[67270]: DEBUG oslo_concurrency.lockutils [req-33e63e0a-e8ca-44e8-914b-c434251522f4 req-92dff061-acab-4ee5-8c14-e0fd65bd7e20 service nova] Acquired lock "refresh_cache-65509bc1-a140-416a-a465-4c9e6efce4a0" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1043.015169] env[67270]: DEBUG nova.network.neutron [req-33e63e0a-e8ca-44e8-914b-c434251522f4 req-92dff061-acab-4ee5-8c14-e0fd65bd7e20 service nova] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Refreshing network info cache for port 0426c809-3240-4de0-80a0-edaebb3eb132 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1043.275025] env[67270]: DEBUG nova.network.neutron [req-33e63e0a-e8ca-44e8-914b-c434251522f4 req-92dff061-acab-4ee5-8c14-e0fd65bd7e20 service nova] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Updated VIF entry in instance network info cache for port 0426c809-3240-4de0-80a0-edaebb3eb132. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1043.275401] env[67270]: DEBUG nova.network.neutron [req-33e63e0a-e8ca-44e8-914b-c434251522f4 req-92dff061-acab-4ee5-8c14-e0fd65bd7e20 service nova] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Updating instance_info_cache with network_info: [{"id": "0426c809-3240-4de0-80a0-edaebb3eb132", "address": "fa:16:3e:5a:2b:a9", "network": {"id": "b1453c02-413a-4c8e-9a64-7dac3ec1cf97", "bridge": "br-int", "label": "tempest-ServersTestJSON-274866854-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "74eeac5d45074a65b8b53ec94a30ef0c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46e1fc20-2067-4e1a-9812-702772a2c82c", "external-id": "nsx-vlan-transportzone-210", "segmentation_id": 210, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0426c809-32", "ovs_interfaceid": "0426c809-3240-4de0-80a0-edaebb3eb132", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1043.286444] env[67270]: DEBUG oslo_concurrency.lockutils [req-33e63e0a-e8ca-44e8-914b-c434251522f4 req-92dff061-acab-4ee5-8c14-e0fd65bd7e20 service nova] Releasing lock "refresh_cache-65509bc1-a140-416a-a465-4c9e6efce4a0" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1048.927158] env[67270]: WARNING oslo_vmware.rw_handles [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1048.927158] env[67270]: ERROR oslo_vmware.rw_handles [ 1048.927990] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be 
tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/5c6d8c52-9c40-46f5-9222-883bc3a038ea/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore2 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1048.929326] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1048.929632] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Copying Virtual Disk [datastore2] vmware_temp/5c6d8c52-9c40-46f5-9222-883bc3a038ea/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore2] vmware_temp/5c6d8c52-9c40-46f5-9222-883bc3a038ea/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1048.930056] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f5695465-1aa5-4c27-84c0-ff9d49c12fe1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.939743] env[67270]: DEBUG oslo_vmware.api [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Waiting for the task: (returnval){ [ 1048.939743] env[67270]: value = "task-4110676" [ 1048.939743] env[67270]: _type = "Task" [ 1048.939743] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1048.949285] env[67270]: DEBUG oslo_vmware.api [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Task: {'id': task-4110676, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1049.450958] env[67270]: DEBUG oslo_vmware.exceptions [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Fault InvalidArgument not matched. 
{{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1049.451266] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1049.451862] env[67270]: ERROR nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1049.451862] env[67270]: Faults: ['InvalidArgument'] [ 1049.451862] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Traceback (most recent call last): [ 1049.451862] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1049.451862] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] yield resources [ 1049.451862] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1049.451862] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] self.driver.spawn(context, instance, image_meta, [ 1049.451862] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1049.451862] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1049.451862] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1049.451862] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] self._fetch_image_if_missing(context, vi) [ 1049.451862] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] image_cache(vi, tmp_image_ds_loc) [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] vm_util.copy_virtual_disk( [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] session._wait_for_task(vmdk_copy_task) [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] return self.wait_for_task(task_ref) [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] return evt.wait() [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] result = hub.switch() [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1049.452282] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] return self.greenlet.switch() [ 1049.452640] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1049.452640] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] self.f(*self.args, **self.kw) [ 1049.452640] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1049.452640] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] raise exceptions.translate_fault(task_info.error) [ 1049.452640] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1049.452640] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Faults: ['InvalidArgument'] [ 1049.452640] env[67270]: ERROR nova.compute.manager [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] [ 1049.452640] env[67270]: INFO nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Terminating instance [ 1049.453768] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1049.453977] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1049.454284] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-896314cb-bea9-4afa-8b96-8540e9d34eca {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.457632] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1049.457821] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1049.458675] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f4f4d0a-3fb4-4ca0-95da-cb9699335f87 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.463082] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1049.463258] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1049.463933] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b8bd7dfa-a641-47c0-a01d-2e4a3056c6ae {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.468012] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1049.468535] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dfb2ba1e-abad-4bf8-906e-8a40138d6713 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.471490] env[67270]: DEBUG oslo_vmware.api [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Waiting for the task: (returnval){ [ 1049.471490] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52222350-f6e5-5239-0214-7d7088d3f914" [ 1049.471490] env[67270]: _type = "Task" [ 1049.471490] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1049.480065] env[67270]: DEBUG oslo_vmware.api [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52222350-f6e5-5239-0214-7d7088d3f914, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1049.541068] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1049.541068] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Deleting contents of the VM from datastore datastore2 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1049.541068] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Deleting the datastore file [datastore2] 87ef9733-e8d6-429e-b23f-8b8aadef784c {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1049.541482] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-10dc8a52-ebf3-406e-8ff2-7e24f902f8a7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.550062] env[67270]: DEBUG oslo_vmware.api [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Waiting for the task: (returnval){ [ 1049.550062] env[67270]: value = "task-4110678" [ 1049.550062] env[67270]: _type = "Task" [ 1049.550062] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1049.560084] env[67270]: DEBUG oslo_vmware.api [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Task: {'id': task-4110678, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1049.982076] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1049.982394] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Creating directory with path [datastore2] vmware_temp/4304b835-cabe-4383-bd79-1e6cb09a6063/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1049.982694] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1cd75de5-f06d-41c4-802e-2b2a604f6894 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.997567] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Created directory with path [datastore2] vmware_temp/4304b835-cabe-4383-bd79-1e6cb09a6063/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1049.997781] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Fetch image to [datastore2] vmware_temp/4304b835-cabe-4383-bd79-1e6cb09a6063/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1049.997956] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore2] vmware_temp/4304b835-cabe-4383-bd79-1e6cb09a6063/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore2 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1049.998771] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8537ca21-1028-4d0e-aaf8-f10df37ac97c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1050.006543] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afd7907e-5b52-4734-bb2a-5506dfdc9d09 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1050.018048] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46302147-9d38-46bd-a5a9-d0041a6d6587 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1050.049382] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8745b48-f46d-4245-8995-bc06355f52dc {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1050.061812] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5ab7a8c4-ef38-4592-a396-5a9387d60d2e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1050.063712] env[67270]: DEBUG oslo_vmware.api [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Task: {'id': task-4110678, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072085} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1050.063956] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1050.064377] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Deleted contents of the VM from datastore datastore2 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1050.064570] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1050.064744] env[67270]: INFO nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Took 0.61 seconds to destroy the instance on the hypervisor. 
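Editor's note: the traceback above (task-4110676 failing with "A specified parameter was not correct: fileType") shows the generic shape of oslo.vmware task handling that this log keeps exercising: the caller invokes a vSphere task (here CopyVirtualDisk_Task), wait_for_task polls the task object until it reaches a terminal state, and a task that ends in error is translated into a Python exception carrying the fault list (here 'InvalidArgument'), which nova.compute.manager then logs before terminating the instance and aborting its resource claim. Below is a minimal, self-contained sketch of that poll-and-translate pattern; VimFault and FakeTask are hypothetical stand-ins for illustration, not the oslo.vmware API.

```python
import time


class VimFault(Exception):
    """Hypothetical stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, msg, fault_list):
        super().__init__(msg)
        self.fault_list = fault_list


class FakeTask:
    """Simulated vSphere task that fails after a couple of polls."""

    def __init__(self):
        self._polls = 0

    def info(self):
        self._polls += 1
        if self._polls < 3:
            return {"state": "running", "progress": self._polls * 10}
        # Terminal error state, mirroring the CopyVirtualDisk_Task failure above.
        return {"state": "error",
                "msg": "A specified parameter was not correct: fileType",
                "faults": ["InvalidArgument"]}


def wait_for_task(task, poll_interval=0.1):
    """Poll until the task is terminal; translate an error into an exception."""
    while True:
        info = task.info()
        if info["state"] == "running":
            time.sleep(poll_interval)  # the real poller re-logs progress each pass
            continue
        if info["state"] == "error":
            raise VimFault(info["msg"], info["faults"])
        return info


try:
    wait_for_task(FakeTask())
except VimFault as exc:
    # The compute manager logs this, destroys the half-built VM and aborts
    # the claim, which is the sequence the surrounding log records trace.
    print(f"spawn failed: {exc} (faults: {exc.fault_list})")
```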
[ 1050.066870] env[67270]: DEBUG nova.compute.claims [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1050.067063] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1050.067287] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1050.095053] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore2 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1050.097695] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.030s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1050.098098] env[67270]: DEBUG nova.compute.utils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Instance 87ef9733-e8d6-429e-b23f-8b8aadef784c could not be found. {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1050.099645] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Instance disappeared during build. 
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1050.099812] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1050.099975] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1050.100156] env[67270]: DEBUG nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1050.100315] env[67270]: DEBUG nova.network.neutron [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1050.128267] env[67270]: DEBUG nova.network.neutron [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1050.138124] env[67270]: INFO nova.compute.manager [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Took 0.04 seconds to deallocate network for instance. [ 1050.143485] env[67270]: DEBUG oslo_vmware.rw_handles [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4304b835-cabe-4383-bd79-1e6cb09a6063/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1050.201713] env[67270]: DEBUG oslo_vmware.rw_handles [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Completed reading data from the image iterator. 
{{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1050.201907] env[67270]: DEBUG oslo_vmware.rw_handles [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4304b835-cabe-4383-bd79-1e6cb09a6063/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1050.221346] env[67270]: DEBUG oslo_concurrency.lockutils [None req-195bd5cc-acc9-48ea-a802-6a1a5003f1be tempest-ServerAddressesNegativeTestJSON-849171022 tempest-ServerAddressesNegativeTestJSON-849171022-project-member] Lock "87ef9733-e8d6-429e-b23f-8b8aadef784c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 285.925s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1054.104692] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1054.105180] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1054.105180] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1054.123507] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1054.123740] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1054.123779] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1054.123912] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1054.124074] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Skipping network cache update for instance because it is Building. 
{{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1054.124192] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1054.151819] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1054.151984] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquired lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1054.152159] env[67270]: DEBUG nova.network.neutron [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Forcefully refreshing network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2004}} [ 1054.152322] env[67270]: DEBUG nova.objects.instance [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lazy-loading 'info_cache' on Instance uuid 3273613a-db47-4af9-b3a5-d0dedffd3332 {{(pid=67270) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1054.517952] env[67270]: DEBUG nova.network.neutron [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Updating instance_info_cache with network_info: [{"id": "4541d405-fa82-485f-83dc-66275107feed", "address": "fa:16:3e:89:95:65", "network": {"id": "a4769043-fe42-412e-ab9c-b263c394496f", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1260212331-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.129", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c34292ba9696423dbe2409fb50e939a5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "54c45719-5690-47bf-b45b-6cad9813071e", "external-id": "nsx-vlan-transportzone-62", "segmentation_id": 62, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4541d405-fa", "ovs_interfaceid": "4541d405-fa82-485f-83dc-66275107feed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1054.528393] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Releasing lock "refresh_cache-3273613a-db47-4af9-b3a5-d0dedffd3332" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1054.528592] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Updated the network 
info_cache for instance {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9885}} [ 1054.758034] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1054.758415] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1054.758579] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1055.757581] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1055.757870] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1055.758014] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1055.758211] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1055.767509] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1055.767725] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1055.767895] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1055.768069] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1055.769148] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3c2eb17d-b9d9-4fac-a1b0-05cbba4e9d5e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.778089] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4926137d-6bba-40e4-ac15-d8be17416a6c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.792348] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d56e024-a668-4964-8e0b-34da8c477753 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.798631] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b7d4633-008e-4159-a11f-02d3231d7a31 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.827768] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180826MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1055.827899] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1055.828104] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1055.883761] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance a073c7a9-d7ee-4d9e-be23-4345ed5f9047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1055.883925] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 5d61c322-6a7d-4991-8cc4-6dcb1be74256 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1055.884087] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1055.884319] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1055.884478] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 39ead031-10c5-40e3-ba91-9d34334398f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1055.884603] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 3273613a-db47-4af9-b3a5-d0dedffd3332 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1055.884723] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 65509bc1-a140-416a-a465-4c9e6efce4a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1055.884920] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1055.885078] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1055.942566] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Acquiring lock "3273613a-db47-4af9-b3a5-d0dedffd3332" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1055.942853] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Lock "3273613a-db47-4af9-b3a5-d0dedffd3332" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1055.943069] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Acquiring lock "3273613a-db47-4af9-b3a5-d0dedffd3332-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1055.943254] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] 
Lock "3273613a-db47-4af9-b3a5-d0dedffd3332-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1055.943415] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Lock "3273613a-db47-4af9-b3a5-d0dedffd3332-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1055.945347] env[67270]: INFO nova.compute.manager [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Terminating instance [ 1055.947395] env[67270]: DEBUG nova.compute.manager [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1055.948118] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Powering off the VM {{(pid=67270) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1055.949029] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-b7a6564a-5567-497d-98b0-ac07cae83b4b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1055.956603] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for the task: (returnval){ [ 1055.956603] env[67270]: value = "task-4110679" [ 1055.956603] env[67270]: _type = "Task" [ 1055.956603] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1055.966191] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110679, 'name': PowerOffVM_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1055.993376] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cb054af-4aad-4022-8017-7066684e8e31 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.002177] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33976808-699c-4c93-9692-866b20cfcd75 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.034643] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97f6d5a1-8d2b-480c-ab41-0c1ae7fc2f73 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.043211] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85efbee8-c793-41fd-9257-3a2e7a1e3cbc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.057562] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1056.067046] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1056.083616] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1056.083832] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1056.468714] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110679, 'name': PowerOffVM_Task, 'duration_secs': 0.215251} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1056.468964] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Powered off the VM {{(pid=67270) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 1056.469178] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Volume detach. Driver type: vmdk {{(pid=67270) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1056.469370] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-814312', 'volume_id': '45a2521f-732e-4ca8-a8fe-33552aab49d8', 'name': 'volume-45a2521f-732e-4ca8-a8fe-33552aab49d8', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '3273613a-db47-4af9-b3a5-d0dedffd3332', 'attached_at': '', 'detached_at': '', 'volume_id': '45a2521f-732e-4ca8-a8fe-33552aab49d8', 'serial': '45a2521f-732e-4ca8-a8fe-33552aab49d8'} {{(pid=67270) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 1056.470267] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29206e09-718e-43b2-a3c2-a5d321a96317 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.489527] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acc26d8f-303d-42e0-a1cb-5e32aaa59fa0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.497125] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0df9160-8bde-429c-99e2-3a6e2706c2da {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.516139] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20873e1e-d223-492f-a58b-4803b080afdb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.531637] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] The volume has not been displaced from its original location: [datastore1] volume-45a2521f-732e-4ca8-a8fe-33552aab49d8/volume-45a2521f-732e-4ca8-a8fe-33552aab49d8.vmdk. No consolidation needed. 
{{(pid=67270) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 1056.536982] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Reconfiguring VM instance instance-00000026 to detach disk 2000 {{(pid=67270) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 1056.537286] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-d119bfd6-26f5-4d24-88b6-aa8aaa83f46d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1056.555927] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for the task: (returnval){ [ 1056.555927] env[67270]: value = "task-4110680" [ 1056.555927] env[67270]: _type = "Task" [ 1056.555927] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1056.564542] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110680, 'name': ReconfigVM_Task} progress is 5%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1057.066218] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110680, 'name': ReconfigVM_Task, 'duration_secs': 0.158244} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1057.066566] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Reconfigured VM instance instance-00000026 to detach disk 2000 {{(pid=67270) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 1057.071395] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-ad4b65b1-b89f-4bf5-8cd3-d7226c80a7d7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.081679] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1057.081912] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1057.088483] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for the task: (returnval){ [ 1057.088483] env[67270]: value = "task-4110681" [ 1057.088483] env[67270]: _type = "Task" [ 1057.088483] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1057.097251] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110681, 'name': ReconfigVM_Task} progress is 5%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1057.598595] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110681, 'name': ReconfigVM_Task, 'duration_secs': 0.151947} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1057.598894] env[67270]: DEBUG nova.virt.vmwareapi.volumeops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-814312', 'volume_id': '45a2521f-732e-4ca8-a8fe-33552aab49d8', 'name': 'volume-45a2521f-732e-4ca8-a8fe-33552aab49d8', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '3273613a-db47-4af9-b3a5-d0dedffd3332', 'attached_at': '', 'detached_at': '', 'volume_id': '45a2521f-732e-4ca8-a8fe-33552aab49d8', 'serial': '45a2521f-732e-4ca8-a8fe-33552aab49d8'} {{(pid=67270) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 1057.599190] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1057.599936] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17ebf440-3759-4b9f-a93c-3e769629d99e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.606779] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1057.607012] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d554e2d2-2416-458f-a248-731344b57a72 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.797535] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1057.797714] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1057.797895] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Deleting the datastore file [datastore1] 3273613a-db47-4af9-b3a5-d0dedffd3332 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1057.798197] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9f29c63e-b3f0-4694-b30e-47d6768b39b3 {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.805641] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for the task: (returnval){ [ 1057.805641] env[67270]: value = "task-4110683" [ 1057.805641] env[67270]: _type = "Task" [ 1057.805641] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1057.813580] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110683, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1058.316212] env[67270]: DEBUG oslo_vmware.api [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Task: {'id': task-4110683, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.088897} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1058.316554] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1058.316676] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1058.316842] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1058.317020] env[67270]: INFO nova.compute.manager [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Took 2.37 seconds to destroy the instance on the hypervisor. [ 1058.317261] env[67270]: DEBUG oslo.service.loopingcall [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1058.317468] env[67270]: DEBUG nova.compute.manager [-] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1058.317620] env[67270]: DEBUG nova.network.neutron [-] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1058.794090] env[67270]: DEBUG nova.network.neutron [-] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1058.805407] env[67270]: INFO nova.compute.manager [-] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Took 0.49 seconds to deallocate network for instance. [ 1058.811166] env[67270]: DEBUG nova.compute.manager [req-35343093-00e1-4bb2-a3f4-d7c48aa2177c req-f3c36df3-3b6b-42a4-bbe7-8b127186113e service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Received event network-vif-deleted-4541d405-fa82-485f-83dc-66275107feed {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1058.811368] env[67270]: INFO nova.compute.manager [req-35343093-00e1-4bb2-a3f4-d7c48aa2177c req-f3c36df3-3b6b-42a4-bbe7-8b127186113e service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Neutron deleted interface 4541d405-fa82-485f-83dc-66275107feed; detaching it from the instance and deleting it from the info cache [ 1058.811534] env[67270]: DEBUG nova.network.neutron [req-35343093-00e1-4bb2-a3f4-d7c48aa2177c req-f3c36df3-3b6b-42a4-bbe7-8b127186113e service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1058.820574] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6ea91389-9b96-4da1-ac4e-99adc6b485d5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.831745] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3989cf27-ca3a-4511-a589-f7387e2aa8ee {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.867998] env[67270]: DEBUG nova.compute.manager [req-35343093-00e1-4bb2-a3f4-d7c48aa2177c req-f3c36df3-3b6b-42a4-bbe7-8b127186113e service nova] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Detach interface failed, port_id=4541d405-fa82-485f-83dc-66275107feed, reason: Instance 3273613a-db47-4af9-b3a5-d0dedffd3332 could not be found. {{(pid=67270) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10838}} [ 1058.869199] env[67270]: INFO nova.compute.manager [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Took 0.06 seconds to detach 1 volumes for instance. 
[ 1058.871337] env[67270]: DEBUG nova.compute.manager [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Deleting volume: 45a2521f-732e-4ca8-a8fe-33552aab49d8 {{(pid=67270) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3217}} [ 1058.935916] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1058.936227] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1058.936429] env[67270]: DEBUG nova.objects.instance [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Lazy-loading 'resources' on Instance uuid 3273613a-db47-4af9-b3a5-d0dedffd3332 {{(pid=67270) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1059.066024] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca9c993e-50c5-454c-b1ab-03358610b917 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1059.075477] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-458ee96b-cfab-4c2b-ac71-46d965cb3111 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1059.108911] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-051c1060-bb4c-4383-be77-c9867c0694bc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1059.117064] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8741f2e0-da54-4b81-9523-6fe87f75ac22 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1059.131112] env[67270]: DEBUG nova.compute.provider_tree [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1059.145635] env[67270]: DEBUG nova.scheduler.client.report [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1059.161114] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.225s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1059.181102] env[67270]: INFO nova.scheduler.client.report [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Deleted allocations for instance 3273613a-db47-4af9-b3a5-d0dedffd3332 [ 1059.231525] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28ffc53b-789a-4b7c-a111-6c70ff3aafe0 tempest-ServersTestBootFromVolume-1434943026 tempest-ServersTestBootFromVolume-1434943026-project-member] Lock "3273613a-db47-4af9-b3a5-d0dedffd3332" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.289s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1081.458778] env[67270]: WARNING oslo_vmware.rw_handles [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1081.458778] env[67270]: ERROR oslo_vmware.rw_handles [ 1081.459543] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/a5c93f1a-5851-4c04-a0cf-e31f3e39e7c9/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1081.460978] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd 
tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1081.461238] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Copying Virtual Disk [datastore1] vmware_temp/a5c93f1a-5851-4c04-a0cf-e31f3e39e7c9/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/a5c93f1a-5851-4c04-a0cf-e31f3e39e7c9/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1081.461522] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-696e244a-4283-4df3-b131-22bcc9e74c20 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1081.470134] env[67270]: DEBUG oslo_vmware.api [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Waiting for the task: (returnval){ [ 1081.470134] env[67270]: value = "task-4110685" [ 1081.470134] env[67270]: _type = "Task" [ 1081.470134] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1081.478647] env[67270]: DEBUG oslo_vmware.api [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Task: {'id': task-4110685, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1081.981459] env[67270]: DEBUG oslo_vmware.exceptions [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Fault InvalidArgument not matched. 
{{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1081.981670] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1081.982253] env[67270]: ERROR nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1081.982253] env[67270]: Faults: ['InvalidArgument'] [ 1081.982253] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Traceback (most recent call last): [ 1081.982253] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1081.982253] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] yield resources [ 1081.982253] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1081.982253] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] self.driver.spawn(context, instance, image_meta, [ 1081.982253] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1081.982253] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1081.982253] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1081.982253] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] self._fetch_image_if_missing(context, vi) [ 1081.982253] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] image_cache(vi, tmp_image_ds_loc) [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] vm_util.copy_virtual_disk( [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] session._wait_for_task(vmdk_copy_task) [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] return self.wait_for_task(task_ref) [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] return evt.wait() [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] result = hub.switch() [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1081.982727] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] return self.greenlet.switch() [ 1081.983266] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1081.983266] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] self.f(*self.args, **self.kw) [ 1081.983266] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1081.983266] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] raise exceptions.translate_fault(task_info.error) [ 1081.983266] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1081.983266] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Faults: ['InvalidArgument'] [ 1081.983266] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] [ 1081.983266] env[67270]: INFO nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Terminating instance [ 1081.984168] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1081.984378] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1081.984620] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-312dee57-a8d2-48c0-917b-ca09650608bb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1081.986879] env[67270]: DEBUG 
nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1081.987098] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1081.987836] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ea86e5f-f9d3-4b9f-bd07-87a6f85ba773 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1081.995501] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1081.995610] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-34699ad6-9ee3-43a0-8f46-62f6216d85c9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1081.999128] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1081.999321] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1082.000051] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d6d9eec1-a3ef-44b7-abd6-5fa77542fe0e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.005220] env[67270]: DEBUG oslo_vmware.api [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Waiting for the task: (returnval){ [ 1082.005220] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]521b31a0-98a0-246c-6c8e-fc036a8d5fd5" [ 1082.005220] env[67270]: _type = "Task" [ 1082.005220] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1082.012800] env[67270]: DEBUG oslo_vmware.api [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]521b31a0-98a0-246c-6c8e-fc036a8d5fd5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1082.073521] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1082.073821] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1082.073921] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Deleting the datastore file [datastore1] a073c7a9-d7ee-4d9e-be23-4345ed5f9047 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1082.074182] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6cbc2d0f-a0cc-4696-be68-7b8771b9b29d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.080862] env[67270]: DEBUG oslo_vmware.api [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Waiting for the task: (returnval){ [ 1082.080862] env[67270]: value = "task-4110687" [ 1082.080862] env[67270]: _type = "Task" [ 1082.080862] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1082.088909] env[67270]: DEBUG oslo_vmware.api [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Task: {'id': task-4110687, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1082.515646] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1082.516046] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Creating directory with path [datastore1] vmware_temp/1cc0732a-19fe-407b-b9a7-8fcd2ab44607/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1082.516165] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2584555f-9ef0-497e-b769-01759f3674d1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.528740] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Created directory with path [datastore1] vmware_temp/1cc0732a-19fe-407b-b9a7-8fcd2ab44607/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1082.528920] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Fetch image to [datastore1] vmware_temp/1cc0732a-19fe-407b-b9a7-8fcd2ab44607/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1082.529114] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/1cc0732a-19fe-407b-b9a7-8fcd2ab44607/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1082.529871] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77945159-ec98-4b4f-b251-4216a5c76bf3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.538520] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b621899a-6ae1-4e92-a0c4-daa85e87f164 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.548378] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8aca2302-f4ea-46dc-b78d-5bdc3ba6fd93 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.580589] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb00796b-e7cd-4a8a-a06c-971ebdebe858 {{(pid=67270) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.592760] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e3a87196-f970-475e-a00f-8cc55fb9176a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.594598] env[67270]: DEBUG oslo_vmware.api [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Task: {'id': task-4110687, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.094258} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1082.594833] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1082.595025] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1082.595205] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1082.595375] env[67270]: INFO nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1082.597625] env[67270]: DEBUG nova.compute.claims [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1082.597809] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1082.598031] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1082.616929] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1082.727290] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77d18c8d-a3af-414c-9dd4-36fd6963be80 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.735448] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e6721d1-7f82-466b-ab90-88c822184fbc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.769308] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9987436f-cf60-4a6e-8326-0eaf913e7426 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.777618] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ea7a73c-16c3-44c2-8a23-075d3fe94b42 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.792457] env[67270]: DEBUG nova.compute.provider_tree [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1082.801598] env[67270]: DEBUG nova.scheduler.client.report [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1082.816683] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.217s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1082.816683] env[67270]: ERROR nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1082.816683] env[67270]: Faults: ['InvalidArgument'] [ 1082.816683] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Traceback (most recent call last): [ 1082.816683] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1082.816683] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] self.driver.spawn(context, instance, image_meta, [ 1082.816683] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1082.816683] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1082.816683] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1082.816683] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] self._fetch_image_if_missing(context, vi) [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] image_cache(vi, tmp_image_ds_loc) [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] vm_util.copy_virtual_disk( [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] session._wait_for_task(vmdk_copy_task) [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] return self.wait_for_task(task_ref) [ 1082.817082] env[67270]: ERROR 
nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] return evt.wait() [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] result = hub.switch() [ 1082.817082] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1082.817742] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] return self.greenlet.switch() [ 1082.817742] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1082.817742] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] self.f(*self.args, **self.kw) [ 1082.817742] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1082.817742] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] raise exceptions.translate_fault(task_info.error) [ 1082.817742] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1082.817742] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Faults: ['InvalidArgument'] [ 1082.817742] env[67270]: ERROR nova.compute.manager [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] [ 1082.817742] env[67270]: DEBUG nova.compute.utils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1082.821642] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1082.823293] env[67270]: ERROR nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a. 
[ 1082.823293] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Traceback (most recent call last): [ 1082.823293] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1082.823293] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1082.823293] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1082.823293] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] result = getattr(controller, method)(*args, **kwargs) [ 1082.823293] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1082.823293] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self._get(image_id) [ 1082.823293] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1082.823293] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1082.823293] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] resp, body = self.http_client.get(url, headers=header) [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self.request(url, 'GET', **kwargs) [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self._handle_response(resp) [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise exc.from_response(resp, resp.content) [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] During handling of the above exception, another exception occurred: [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1082.823702] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Traceback (most recent call last): [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] yield resources [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self.driver.spawn(context, instance, image_meta, [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self._fetch_image_if_missing(context, vi) [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] image_fetch(context, vi, tmp_image_ds_loc) [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] images.fetch_image( [ 1082.824093] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] metadata = IMAGE_API.get(context, image_ref) [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return session.show(context, image_id, [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] _reraise_translated_image_exception(image_id) [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise new_exc.with_traceback(exc_trace) [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] result = getattr(controller, method)(*args, **kwargs) [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1082.824546] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self._get(image_id) [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] resp, body = self.http_client.get(url, headers=header) [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self.request(url, 'GET', **kwargs) [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self._handle_response(resp) [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise exc.from_response(resp, resp.content) [ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] nova.exception.ImageNotAuthorized: Not authorized for image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a. 
[ 1082.824939] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1082.825434] env[67270]: INFO nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Terminating instance [ 1082.825557] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1082.825769] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1082.826477] env[67270]: DEBUG nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Build of instance a073c7a9-d7ee-4d9e-be23-4345ed5f9047 was re-scheduled: A specified parameter was not correct: fileType [ 1082.826477] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1082.826861] env[67270]: DEBUG nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1082.827043] env[67270]: DEBUG nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1082.827219] env[67270]: DEBUG nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1082.827384] env[67270]: DEBUG nova.network.neutron [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1082.829047] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9cb45dac-bc64-453f-ace7-a7d5e0835310 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.839932] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1082.840271] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1082.841997] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5bd4b871-aa48-42b4-957f-788a927cb48e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.848627] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1082.848984] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1082.850300] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44731122-abf6-4b75-937e-e60139599c1d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.857212] env[67270]: DEBUG oslo_vmware.api [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Waiting for the task: (returnval){ [ 1082.857212] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52e450f6-3710-3fdd-36d3-b3fe68244913" [ 1082.857212] env[67270]: _type = "Task" [ 1082.857212] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1082.869650] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1082.870837] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c6e10402-8710-4ef0-9fca-f6bf6776832d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.878909] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1082.879346] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Creating directory with path [datastore1] vmware_temp/9c672b4f-38e1-4470-a576-2fe8fa0dcedc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1082.879680] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3a528165-3349-4327-beb4-ed823d1728ee {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.903719] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Created directory with path [datastore1] vmware_temp/9c672b4f-38e1-4470-a576-2fe8fa0dcedc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1082.904128] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Fetch image to [datastore1] vmware_temp/9c672b4f-38e1-4470-a576-2fe8fa0dcedc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1082.904442] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/9c672b4f-38e1-4470-a576-2fe8fa0dcedc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1082.905794] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c322dce2-2c92-4fab-ad34-104697f4b64b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.917690] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3efde58e-bab4-4d35-b2da-d86093720ea8 
{{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.933378] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e881083-d4ac-4bbb-b7a1-4b9c763fbcff {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.968772] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e05c4ca-ea37-493d-975d-0c7418cc4efa {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.971482] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1082.971682] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1082.971854] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Deleting the datastore file [datastore1] eff1fe32-1755-4536-9ad9-286e1392a08d {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1082.972134] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ea2dafb6-3062-4288-92aa-495901f9b3ab {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.980250] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6b348c24-5787-47ed-9070-fae25ced36c2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1082.982183] env[67270]: DEBUG oslo_vmware.api [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Waiting for the task: (returnval){ [ 1082.982183] env[67270]: value = "task-4110689" [ 1082.982183] env[67270]: _type = "Task" [ 1082.982183] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1083.005732] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1083.059996] env[67270]: DEBUG oslo_vmware.rw_handles [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9c672b4f-38e1-4470-a576-2fe8fa0dcedc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1083.121108] env[67270]: DEBUG oslo_vmware.rw_handles [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1083.121322] env[67270]: DEBUG oslo_vmware.rw_handles [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9c672b4f-38e1-4470-a576-2fe8fa0dcedc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1083.151286] env[67270]: DEBUG nova.network.neutron [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1083.169393] env[67270]: INFO nova.compute.manager [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Took 0.34 seconds to deallocate network for instance. 
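The three oslo_vmware.rw_handles records above trace the whole upload: a write connection is opened against the datastore "folder" URL with the file size (21318656 bytes) declared up front, the bytes produced by the Glance image iterator are streamed through it, and the write handle is closed. The sketch below illustrates that flow with only the Python standard library; the URL, session cookie, and chunk iterator are placeholders, and this is a simplified stand-in for the _create_write_connection/close path shown in the records, not the oslo_vmware implementation.

    # Minimal sketch of streaming image bytes to a vSphere datastore folder
    # URL, in the spirit of the rw_handles records above. All names here are
    # illustrative; this is not the oslo_vmware implementation.
    import http.client
    import ssl
    import urllib.parse

    def upload_to_datastore(url, session_cookie, chunks, size):
        """PUT `size` bytes drawn from `chunks` (an iterator of bytes) to `url`."""
        parts = urllib.parse.urlsplit(url)
        ctx = ssl.create_default_context()
        ctx.check_hostname = False           # devstack-style lab endpoint;
        ctx.verify_mode = ssl.CERT_NONE      # verify certificates in production
        conn = http.client.HTTPSConnection(parts.hostname, parts.port or 443,
                                           context=ctx)
        path = parts.path + ('?' + parts.query if parts.query else '')
        conn.putrequest('PUT', path)
        conn.putheader('Cookie', session_cookie)      # vCenter session ticket
        conn.putheader('Content-Length', str(size))   # size known before writing
        conn.endheaders()
        for chunk in chunks:                          # "reading data from the
            conn.send(chunk)                          #  image iterator"
        resp = conn.getresponse()                     # response only read on
        resp.read()                                   # close; a peer that hangs
        conn.close()                                  # up surfaces here as
        return resp.status                            # RemoteDisconnected

Note that the response is only consumed when the handle is closed, which is why a host that drops the connection shows up later as http.client.RemoteDisconnected raised from rw_handles close, exactly as in the WARNING near the end of this log.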
[ 1083.266331] env[67270]: INFO nova.scheduler.client.report [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Deleted allocations for instance a073c7a9-d7ee-4d9e-be23-4345ed5f9047 [ 1083.288701] env[67270]: DEBUG oslo_concurrency.lockutils [None req-5f3271d2-0a71-4ed4-9ffd-378ba34694cd tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 387.655s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1083.288975] env[67270]: DEBUG oslo_concurrency.lockutils [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 188.064s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1083.289226] env[67270]: DEBUG oslo_concurrency.lockutils [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Acquiring lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1083.289446] env[67270]: DEBUG oslo_concurrency.lockutils [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1083.289620] env[67270]: DEBUG oslo_concurrency.lockutils [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1083.291813] env[67270]: INFO nova.compute.manager [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Terminating instance [ 1083.293856] env[67270]: DEBUG nova.compute.manager [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Start destroying the instance on the hypervisor. 
{{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1083.294080] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1083.294588] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a69c57b8-037f-471a-97ab-2f1849da1c02 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1083.304056] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac71d2d9-50fe-4759-a083-0eb7bf63256d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1083.333552] env[67270]: WARNING nova.virt.vmwareapi.vmops [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a073c7a9-d7ee-4d9e-be23-4345ed5f9047 could not be found. [ 1083.333772] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1083.333958] env[67270]: INFO nova.compute.manager [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1083.334231] env[67270]: DEBUG oslo.service.loopingcall [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1083.334478] env[67270]: DEBUG nova.compute.manager [-] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1083.334611] env[67270]: DEBUG nova.network.neutron [-] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1083.359377] env[67270]: DEBUG nova.network.neutron [-] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1083.368099] env[67270]: INFO nova.compute.manager [-] [instance: a073c7a9-d7ee-4d9e-be23-4345ed5f9047] Took 0.03 seconds to deallocate network for instance. 
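The destroy path above is deliberately tolerant of a VM that has already vanished: SearchIndex.FindAllByUuid comes back empty, vmops logs "Instance does not exist on backend", treats the instance as destroyed anyway, and the manager still walks the network-deallocation step. A hedged sketch of that pattern follows; the backend object, its methods, and the exception class are hypothetical stand-ins, not Nova's actual interfaces.

    # Illustrative only: the "missing VM counts as destroyed" pattern from the
    # vmops/compute.manager records above. `backend`, `find_vm`, etc. are
    # hypothetical names, not Nova's real signatures.
    class InstanceNotFound(Exception):
        """Raised when the hypervisor has no VM for this UUID."""

    def destroy_instance(backend, network_api, context, instance_uuid, log):
        try:
            vm_ref = backend.find_vm(instance_uuid)    # SearchIndex.FindAllByUuid
            backend.unregister_vm(vm_ref)              # VirtualMachine.UnregisterVM
            backend.delete_datastore_files(vm_ref)     # DeleteDatastoreFile_Task
        except InstanceNotFound:
            # Nothing left on the hypervisor; warn and fall through so that
            # network and resource cleanup still runs.
            log.warning("Instance does not exist on backend: %s", instance_uuid)
        log.debug("Instance destroyed")
        network_api.deallocate_for_instance(context, instance_uuid)

Swallowing InstanceNotFound here is what keeps teardown idempotent: a re-scheduled or half-built instance can always be "destroyed" again without failing the cleanup that follows, which is precisely what the records above show for a073c7a9-d7ee-4d9e-be23-4345ed5f9047.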
[ 1083.457967] env[67270]: DEBUG oslo_concurrency.lockutils [None req-43fda3f2-a4de-428d-a13e-b9d922afb412 tempest-ServerActionsTestJSON-159205297 tempest-ServerActionsTestJSON-159205297-project-member] Lock "a073c7a9-d7ee-4d9e-be23-4345ed5f9047" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.169s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1083.493185] env[67270]: DEBUG oslo_vmware.api [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Task: {'id': task-4110689, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.091763} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1083.493440] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1083.493630] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1083.493803] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1083.493976] env[67270]: INFO nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Took 0.65 seconds to destroy the instance on the hypervisor. 
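DeleteDatastoreFile_Task above is a vCenter-side task: Nova submits it, then polls the task object until it reports success (here after 0.091763s) or error, in which case the fault is translated into an exception class before being re-raised. Below is a minimal polling loop in that spirit; get_task_info, the state names, and TaskFailed are assumptions for illustration, not the oslo_vmware API.

    # Simplified stand-in for the wait_for_task/_poll_task pair seen in these
    # records; the real code drives the poll from a looping call, not sleep().
    import time

    class TaskFailed(Exception):
        """Raised when the remote task ends in an error state."""

    def wait_for_task(get_task_info, task_id, interval=0.5):
        while True:
            info = get_task_info(task_id)            # one _poll_task round trip
            if info['state'] == 'success':
                return info                          # carries e.g. duration_secs
            if info['state'] == 'error':
                raise TaskFailed(info.get('error'))  # cf. translate_fault(...)
            time.sleep(interval)                     # queued/running: poll again

The "Fault InvalidArgument not matched" DEBUG line later in this log is the error branch of exactly this translation step: the copy task failed, no specific fault class matched the fault string, and oslo_vmware fell back to raising a generic VimFaultException.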
[ 1083.496407] env[67270]: DEBUG nova.compute.claims [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1083.496407] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1083.496631] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1083.524845] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1083.525605] env[67270]: DEBUG nova.compute.utils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Instance eff1fe32-1755-4536-9ad9-286e1392a08d could not be found. {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1083.528233] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Instance disappeared during build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1083.528402] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1083.528569] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1083.528815] env[67270]: DEBUG nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1083.529051] env[67270]: DEBUG nova.network.neutron [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1083.663320] env[67270]: DEBUG neutronclient.v2_0.client [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67270) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1083.667646] env[67270]: ERROR nova.compute.manager [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Traceback (most recent call last): [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] result = getattr(controller, method)(*args, **kwargs) [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self._get(image_id) [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1083.667646] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] resp, body = self.http_client.get(url, headers=header) [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self.request(url, 'GET', **kwargs) [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self._handle_response(resp) [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise exc.from_response(resp, resp.content) [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] During handling of the above exception, another exception occurred: [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Traceback (most recent call last): [ 1083.668057] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self.driver.spawn(context, instance, image_meta, [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self._fetch_image_if_missing(context, vi) [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] image_fetch(context, vi, tmp_image_ds_loc) [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] images.fetch_image( [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: 
eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] metadata = IMAGE_API.get(context, image_ref) [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1083.668372] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return session.show(context, image_id, [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] _reraise_translated_image_exception(image_id) [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise new_exc.with_traceback(exc_trace) [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] result = getattr(controller, method)(*args, **kwargs) [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self._get(image_id) [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1083.668746] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] resp, body = self.http_client.get(url, headers=header) [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self.request(url, 'GET', **kwargs) [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] 
return self._handle_response(resp) [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise exc.from_response(resp, resp.content) [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] nova.exception.ImageNotAuthorized: Not authorized for image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a. [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] During handling of the above exception, another exception occurred: [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Traceback (most recent call last): [ 1083.669118] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self._build_and_run_instance(context, instance, image, [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] with excutils.save_and_reraise_exception(): [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self.force_reraise() [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise self.value [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] with self.rt.instance_claim(context, instance, node, allocs, [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self.abort() [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1083.669471] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return f(*args, **kwargs) [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self._unset_instance_host_and_node(instance) [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] instance.save() [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] updates, result = self.indirection_api.object_action( [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return cctxt.call(context, 'object_action', objinst=objinst, [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1083.669843] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] result = self.transport._send( [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self._driver.send(target, ctxt, message, [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise result [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] nova.exception_Remote.InstanceNotFound_Remote: Instance eff1fe32-1755-4536-9ad9-286e1392a08d could not be found. 
[ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Traceback (most recent call last): [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return getattr(target, method)(*args, **kwargs) [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670224] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return fn(self, *args, **kwargs) [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] old_ref, inst_ref = db.instance_update_and_get_original( [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return f(*args, **kwargs) [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] with excutils.save_and_reraise_exception() as ectxt: [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self.force_reraise() [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670588] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise self.value [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return f(*args, 
**kwargs) [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return f(context, *args, **kwargs) [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise exception.InstanceNotFound(instance_id=uuid) [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.670994] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] nova.exception.InstanceNotFound: Instance eff1fe32-1755-4536-9ad9-286e1392a08d could not be found. [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] During handling of the above exception, another exception occurred: [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Traceback (most recent call last): [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] ret = obj(*args, **kwargs) [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] exception_handler_v20(status_code, error_body) [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise client_exc(message=error_message, [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1083.671403] 
env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Neutron server returns request_ids: ['req-2b7ee669-ac4e-4152-8421-40530217d653'] [ 1083.671403] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] During handling of the above exception, another exception occurred: [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] Traceback (most recent call last): [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self._deallocate_network(context, instance, requested_networks) [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self.network_api.deallocate_for_instance( [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] data = neutron.list_ports(**search_opts) [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] ret = obj(*args, **kwargs) [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1083.671743] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self.list('ports', self.ports_path, retrieve_all, [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] ret = obj(*args, **kwargs) [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] for r in self._pagination(collection, path, **params): [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] res = self.get(path, params=params) [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] ret = obj(*args, **kwargs) [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self.retry_request("GET", action, body=body, [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] ret = obj(*args, **kwargs) [ 1083.672147] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1083.672532] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] return self.do_request(method, action, body=body, [ 1083.672532] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1083.672532] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] ret = obj(*args, **kwargs) [ 1083.672532] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1083.672532] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] self._handle_fault_response(status_code, replybody, resp) [ 1083.672532] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1083.672532] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] raise exception.Unauthorized() [ 1083.672532] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] nova.exception.Unauthorized: Not authorized. 
[ 1083.672532] env[67270]: ERROR nova.compute.manager [instance: eff1fe32-1755-4536-9ad9-286e1392a08d] [ 1083.690296] env[67270]: DEBUG oslo_concurrency.lockutils [None req-aa8da924-eb80-4512-bd3c-9abcf4da4ae3 tempest-ServersAdminTestJSON-662343238 tempest-ServersAdminTestJSON-662343238-project-member] Lock "eff1fe32-1755-4536-9ad9-286e1392a08d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 387.262s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1098.076126] env[67270]: WARNING oslo_vmware.rw_handles [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1098.076126] env[67270]: ERROR oslo_vmware.rw_handles [ 1098.076774] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/4304b835-cabe-4383-bd79-1e6cb09a6063/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore2 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1098.079146] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1098.079392] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Copying Virtual Disk [datastore2] vmware_temp/4304b835-cabe-4383-bd79-1e6cb09a6063/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore2] vmware_temp/4304b835-cabe-4383-bd79-1e6cb09a6063/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1098.079703] env[67270]: DEBUG 
[ 1098.089669] env[67270]: DEBUG oslo_vmware.api [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Waiting for the task: (returnval){
[ 1098.089669] env[67270]: value = "task-4110690"
[ 1098.089669] env[67270]: _type = "Task"
[ 1098.089669] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1098.098908] env[67270]: DEBUG oslo_vmware.api [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Task: {'id': task-4110690, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1098.600190] env[67270]: DEBUG oslo_vmware.exceptions [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Fault InvalidArgument not matched. {{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 1098.600457] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1098.601036] env[67270]: ERROR nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1098.601036] env[67270]: Faults: ['InvalidArgument']
[ 1098.601036] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Traceback (most recent call last):
[ 1098.601036] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1098.601036] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] yield resources
[ 1098.601036] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1098.601036] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] self.driver.spawn(context, instance, image_meta,
[ 1098.601036] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1098.601036] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1098.601036] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1098.601036] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] self._fetch_image_if_missing(context, vi)
[ 1098.601036] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] image_cache(vi, tmp_image_ds_loc)
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] vm_util.copy_virtual_disk(
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] session._wait_for_task(vmdk_copy_task)
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] return self.wait_for_task(task_ref)
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] return evt.wait()
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] result = hub.switch()
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1098.601416] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] return self.greenlet.switch()
[ 1098.601769] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1098.601769] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] self.f(*self.args, **self.kw)
[ 1098.601769] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1098.601769] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] raise exceptions.translate_fault(task_info.error)
[ 1098.601769] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1098.601769] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Faults: ['InvalidArgument']
[ 1098.601769] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3]
[ 1098.601769] env[67270]: INFO nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Terminating instance
[ 1098.603011] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Acquired lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1098.603237] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1098.603709] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquiring lock "refresh_cache-39ead031-10c5-40e3-ba91-9d34334398f3" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1098.603865] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquired lock "refresh_cache-39ead031-10c5-40e3-ba91-9d34334398f3" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1098.604049] env[67270]: DEBUG nova.network.neutron [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1098.605174] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-723d040d-57c4-4584-9c73-fbfed99a1713 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1098.615343] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1098.615535] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1098.616828] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fb7a303e-d2ab-4194-9d9d-272d90d1c40d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1098.622686] env[67270]: DEBUG oslo_vmware.api [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Waiting for the task: (returnval){
[ 1098.622686] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]522aa2e4-e513-9471-fc0f-c1ceabe39532"
[ 1098.622686] env[67270]: _type = "Task"
[ 1098.622686] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1098.633319] env[67270]: DEBUG oslo_vmware.api [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]522aa2e4-e513-9471-fc0f-c1ceabe39532, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1098.637668] env[67270]: DEBUG nova.network.neutron [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1098.806015] env[67270]: DEBUG nova.network.neutron [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1098.815254] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Releasing lock "refresh_cache-39ead031-10c5-40e3-ba91-9d34334398f3" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1098.818025] env[67270]: DEBUG nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1098.818025] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1098.818025] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68d0f8a3-60a2-4b90-bea0-5b89389aca94 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1098.827248] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1098.827248] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-56a52f4f-d3ec-4c9e-9e4e-ad3a556bb6b0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1098.869024] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1098.869024] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Deleting contents of the VM from datastore datastore2 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1098.869024] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Deleting the datastore file [datastore2] 39ead031-10c5-40e3-ba91-9d34334398f3 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1098.869024] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c21326b6-2627-44ae-a014-0d7b37ba1cec {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1098.878571] env[67270]: DEBUG oslo_vmware.api [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Waiting for the task: (returnval){
[ 1098.878571] env[67270]: value = "task-4110692"
[ 1098.878571] env[67270]: _type = "Task"
[ 1098.878571] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1098.887503] env[67270]: DEBUG oslo_vmware.api [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Task: {'id': task-4110692, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1099.132619] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1099.133021] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Creating directory with path [datastore2] vmware_temp/f860ce6f-4a3f-48fa-acc9-f80b93cc92a2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1099.133409] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-99aab414-8b88-415e-ad9f-14a026ecc52e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1099.146377] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Created directory with path [datastore2] vmware_temp/f860ce6f-4a3f-48fa-acc9-f80b93cc92a2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1099.146594] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Fetch image to [datastore2] vmware_temp/f860ce6f-4a3f-48fa-acc9-f80b93cc92a2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1099.146767] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore2] vmware_temp/f860ce6f-4a3f-48fa-acc9-f80b93cc92a2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore2 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1099.147561] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66d0980c-561f-404f-b650-22b252738550 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1099.156321] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f9f2f62-ce4c-4779-aaef-1bed3639f498 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1099.167120] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88c9657e-fe50-4323-84a7-0d3c17cc3392 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1099.202063] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdf71521-bc1d-4754-8686-a60487c974ca {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1099.208883] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4d919d03-2e64-4424-8811-1a0634ef6960 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1099.231690] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore2 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1099.279429] env[67270]: DEBUG oslo_vmware.rw_handles [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f860ce6f-4a3f-48fa-acc9-f80b93cc92a2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1099.343812] env[67270]: DEBUG oslo_vmware.rw_handles [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1099.344465] env[67270]: DEBUG oslo_vmware.rw_handles [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f860ce6f-4a3f-48fa-acc9-f80b93cc92a2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1099.388213] env[67270]: DEBUG oslo_vmware.api [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Task: {'id': task-4110692, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.03958} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1099.389288] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1099.389288] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Deleted contents of the VM from datastore datastore2 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1099.389288] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1099.389288] env[67270]: INFO nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Took 0.57 seconds to destroy the instance on the hypervisor.
[ 1099.389499] env[67270]: DEBUG oslo.service.loopingcall [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1099.389558] env[67270]: DEBUG nova.compute.manager [-] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Skipping network deallocation for instance since networking was not requested. {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 1099.392901] env[67270]: DEBUG nova.compute.claims [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1099.393480] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1099.393480] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1099.529066] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93b965e0-a103-48a8-9d8b-962d7a4b9bfb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1099.537099] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-790a9ed9-8bf6-42d2-98e5-05c64ceeb92f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1099.572641] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-957acce7-46d5-41a0-aa94-9a0e41d9d16e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1099.583576] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f293b4c3-7e56-48a8-83f8-56dace531362 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1099.598766] env[67270]: DEBUG nova.compute.provider_tree [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1099.607805] env[67270]: DEBUG nova.scheduler.client.report [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1099.627195] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.233s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1099.627377] env[67270]: ERROR nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1099.627377] env[67270]: Faults: ['InvalidArgument']
[ 1099.627377] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Traceback (most recent call last):
[ 1099.627377] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1099.627377] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] self.driver.spawn(context, instance, image_meta,
[ 1099.627377] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1099.627377] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1099.627377] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1099.627377] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] self._fetch_image_if_missing(context, vi)
[ 1099.627377] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1099.627377] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] image_cache(vi, tmp_image_ds_loc)
[ 1099.627377] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] vm_util.copy_virtual_disk(
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] session._wait_for_task(vmdk_copy_task)
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] return self.wait_for_task(task_ref)
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] return evt.wait()
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] result = hub.switch()
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] return self.greenlet.switch()
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1099.627883] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] self.f(*self.args, **self.kw)
[ 1099.628392] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1099.628392] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] raise exceptions.translate_fault(task_info.error)
[ 1099.628392] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1099.628392] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Faults: ['InvalidArgument']
[ 1099.628392] env[67270]: ERROR nova.compute.manager [instance: 39ead031-10c5-40e3-ba91-9d34334398f3]
[ 1099.628392] env[67270]: DEBUG nova.compute.utils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1099.630526] env[67270]: DEBUG nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Build of instance 39ead031-10c5-40e3-ba91-9d34334398f3 was re-scheduled: A specified parameter was not correct: fileType
[ 1099.630526] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1099.630925] env[67270]: DEBUG nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1099.631178] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquiring lock "refresh_cache-39ead031-10c5-40e3-ba91-9d34334398f3" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1099.631332] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Acquired lock "refresh_cache-39ead031-10c5-40e3-ba91-9d34334398f3" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1099.631489] env[67270]: DEBUG nova.network.neutron [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1099.885987] env[67270]: DEBUG nova.network.neutron [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1099.989222] env[67270]: DEBUG nova.network.neutron [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1099.999081] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Releasing lock "refresh_cache-39ead031-10c5-40e3-ba91-9d34334398f3" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1099.999797] env[67270]: DEBUG nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1099.999797] env[67270]: DEBUG nova.compute.manager [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] [instance: 39ead031-10c5-40e3-ba91-9d34334398f3] Skipping network deallocation for instance since networking was not requested. {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 1100.117664] env[67270]: INFO nova.scheduler.client.report [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Deleted allocations for instance 39ead031-10c5-40e3-ba91-9d34334398f3
[ 1100.135389] env[67270]: DEBUG oslo_concurrency.lockutils [None req-13c66f99-e39c-44c3-9e5b-dc0f6fe6d29e tempest-ServerShowV257Test-1553708197 tempest-ServerShowV257Test-1553708197-project-member] Lock "39ead031-10c5-40e3-ba91-9d34334398f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 97.496s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1114.757658] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1114.757970] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1114.758104] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1115.758799] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1115.759220] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1115.759220] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1115.772873] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1115.773052] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1115.773254] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1115.773442] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 1115.773595] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1115.774221] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1115.774221] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1115.775015] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1115.783953] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1115.784106] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1115.784279] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1115.784435] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1115.785507] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ccb2a2a-3777-43bd-882a-65b70b56026c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1115.794965] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ce4b5dc-844f-40c0-98a6-a8f8ee323397 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1115.809057] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62f6d598-f518-4e38-87a4-e528813d8e1b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1115.816124] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b40e0f20-5b4e-4b06-8e7d-9208a5754fb3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1115.847198] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180774MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1115.847360] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1115.847564] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1115.894898] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 5d61c322-6a7d-4991-8cc4-6dcb1be74256 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1115.895091] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1115.895225] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1115.895350] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 65509bc1-a140-416a-a465-4c9e6efce4a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1115.895535] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1115.895674] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1115.956491] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c2ea36b-4a8c-4a62-97a0-c1adc99afa98 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1115.964626] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75b0282a-06d4-46c7-976e-c04a4ac08596 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1115.997290] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d307697-7805-4574-80af-5aa41699483f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1116.005360] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2643045f-49bd-450d-8dba-e88b7e60e70e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1116.020823] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1116.028841] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1116.041932] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1116.042124] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1118.025921] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1118.025921] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1118.758811] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1119.753853] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1133.090910] env[67270]: WARNING oslo_vmware.rw_handles [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles response.begin()
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1133.090910] env[67270]: ERROR oslo_vmware.rw_handles
[ 1133.091931] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/9c672b4f-38e1-4470-a576-2fe8fa0dcedc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1133.093378] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1133.093627] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Copying Virtual Disk [datastore1] vmware_temp/9c672b4f-38e1-4470-a576-2fe8fa0dcedc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/9c672b4f-38e1-4470-a576-2fe8fa0dcedc/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1133.093924] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c81ffc70-67cd-4527-bb59-9af053ea8a4b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1133.102413] env[67270]: DEBUG oslo_vmware.api [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Waiting for the task: (returnval){
[ 1133.102413] env[67270]: value = "task-4110693"
[ 1133.102413] env[67270]: _type = "Task"
[ 1133.102413] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1133.110728] env[67270]: DEBUG oslo_vmware.api [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Task: {'id': task-4110693, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1133.612965] env[67270]: DEBUG oslo_vmware.exceptions [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Fault InvalidArgument not matched. {{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 1133.613249] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1133.613802] env[67270]: ERROR nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1133.613802] env[67270]: Faults: ['InvalidArgument']
[ 1133.613802] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Traceback (most recent call last):
[ 1133.613802] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1133.613802] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] yield resources
[ 1133.613802] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1133.613802] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] self.driver.spawn(context, instance, image_meta,
[ 1133.613802] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1133.613802] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1133.613802] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1133.613802] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] self._fetch_image_if_missing(context, vi)
[ 1133.613802] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] image_cache(vi, tmp_image_ds_loc)
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] vm_util.copy_virtual_disk(
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] session._wait_for_task(vmdk_copy_task)
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] return self.wait_for_task(task_ref)
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] return evt.wait()
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] result = hub.switch()
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1133.614358] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] return self.greenlet.switch()
[ 1133.614681] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1133.614681] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] self.f(*self.args, **self.kw)
[ 1133.614681] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1133.614681] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] raise exceptions.translate_fault(task_info.error)
[ 1133.614681] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1133.614681] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Faults: ['InvalidArgument']
[ 1133.614681] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256]
[ 1133.614681] env[67270]: INFO nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Terminating instance
[ 1133.615768] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1133.615985] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1133.616241] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-394d5f84-9836-4623-bd3a-2b0cf04dde04 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1133.618759] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Start destroying the instance on the hypervisor.
{{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1133.618950] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1133.619700] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b06bb978-6fcc-43ab-a678-6335f3483729 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1133.627036] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1133.627277] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f94ac54f-fe9c-4955-a407-6b900e3fc67e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1133.629623] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1133.629802] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1133.630821] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-385eb6bf-d54f-4af9-8aa6-654f19889221 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1133.636098] env[67270]: DEBUG oslo_vmware.api [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Waiting for the task: (returnval){ [ 1133.636098] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52922799-f439-6139-d3aa-68b6ed885fcc" [ 1133.636098] env[67270]: _type = "Task" [ 1133.636098] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1133.643634] env[67270]: DEBUG oslo_vmware.api [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52922799-f439-6139-d3aa-68b6ed885fcc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1133.706211] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1133.706455] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1133.706697] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Deleting the datastore file [datastore1] 5d61c322-6a7d-4991-8cc4-6dcb1be74256 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1133.706998] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-644776c0-debe-4364-8aad-dea172d27f78 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1133.714215] env[67270]: DEBUG oslo_vmware.api [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Waiting for the task: (returnval){ [ 1133.714215] env[67270]: value = "task-4110695" [ 1133.714215] env[67270]: _type = "Task" [ 1133.714215] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1133.722312] env[67270]: DEBUG oslo_vmware.api [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Task: {'id': task-4110695, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1134.146443] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1134.146831] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Creating directory with path [datastore1] vmware_temp/864a8815-1633-4869-97ef-b54e6ac7d21f/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1134.146957] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e2ac086e-4f44-42dc-a25f-99eb47ad1d7d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.159456] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Created directory with path [datastore1] vmware_temp/864a8815-1633-4869-97ef-b54e6ac7d21f/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1134.159657] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Fetch image to [datastore1] vmware_temp/864a8815-1633-4869-97ef-b54e6ac7d21f/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1134.159816] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/864a8815-1633-4869-97ef-b54e6ac7d21f/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1134.160616] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfedbc1b-3f6f-4ac9-b910-7a94f0133960 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.167953] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-889d3d17-510c-4e3e-9b5b-bbf3ea02ad98 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.177524] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cf77f61-08a5-4f99-9ad6-f557b5f9f11c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.209765] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ece8b4f-ac39-48e9-bb4a-b2910b28f399 {{(pid=67270) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.218706] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5dc47fa5-7b56-44aa-8e24-d3e5dd5bc12b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.225120] env[67270]: DEBUG oslo_vmware.api [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Task: {'id': task-4110695, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082034} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1134.225367] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1134.225566] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1134.225739] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1134.225925] env[67270]: INFO nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Took 0.61 seconds to destroy the instance on the hypervisor. 
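The repeated "Task: {'id': task-...} progress is 0%" lines and the VimFaultException traceback above both come out of oslo.vmware's task-polling loop: wait_for_task re-reads the vCenter TaskInfo until the task leaves the queued/running states, then either returns or translates the server-side fault into a Python exception. The following is a minimal sketch of that loop; the TaskInfo/TaskError types and the exception class are simplified stand-ins, not the library's exact signatures.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TaskError:
    msg: str
    fault_list: list = field(default_factory=list)

@dataclass
class TaskInfo:
    state: str                      # 'queued' | 'running' | 'success' | 'error'
    progress: int = 0
    error: TaskError | None = None

class VimFaultException(Exception):
    """Simplified stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(poll, interval=0.5):
    """Poll a vCenter task until it finishes, mimicking the _poll_task loop.

    `poll` is a hypothetical callable returning the current TaskInfo.
    """
    while True:
        info = poll()
        if info.state in ('queued', 'running'):
            print(f"progress is {info.progress}%")   # the DEBUG lines above
            time.sleep(interval)
            continue
        if info.state == 'success':
            return info
        # state == 'error': translate the fault, which is what surfaces in
        # the log as "VimFaultException: A specified parameter was not
        # correct: fileType / Faults: ['InvalidArgument']".
        raise VimFaultException(info.error.fault_list, info.error.msg)
```

In the real library this loop runs inside an oslo.service looping call on an eventlet hub, which is why the traceback above passes through hub.switch() and loopingcall._inner before reaching _poll_task.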
[ 1134.228074] env[67270]: DEBUG nova.compute.claims [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1134.228248] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1134.228508] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1134.241268] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1134.287271] env[67270]: DEBUG oslo_vmware.rw_handles [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/864a8815-1633-4869-97ef-b54e6ac7d21f/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1134.345448] env[67270]: DEBUG oslo_vmware.rw_handles [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1134.345624] env[67270]: DEBUG oslo_vmware.rw_handles [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/864a8815-1633-4869-97ef-b54e6ac7d21f/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1134.381259] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc0a9148-a15e-4fac-af83-124cbd0f2de3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.390015] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1bb6c84-722b-4c13-bffd-32fda99b45bc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.420855] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bda70803-c4b9-4a85-bc6a-3c38432a4930 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.428945] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b107bb9-dcbf-4335-9e1a-ca6eb448e0c1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.442790] env[67270]: DEBUG nova.compute.provider_tree [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1134.451477] env[67270]: DEBUG nova.scheduler.client.report [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1134.464565] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.236s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1134.465153] env[67270]: ERROR nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1134.465153] env[67270]: Faults: ['InvalidArgument'] [ 1134.465153] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Traceback (most recent call last): [ 1134.465153] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1134.465153] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] 
self.driver.spawn(context, instance, image_meta, [ 1134.465153] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1134.465153] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1134.465153] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1134.465153] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] self._fetch_image_if_missing(context, vi) [ 1134.465153] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1134.465153] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] image_cache(vi, tmp_image_ds_loc) [ 1134.465153] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] vm_util.copy_virtual_disk( [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] session._wait_for_task(vmdk_copy_task) [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] return self.wait_for_task(task_ref) [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] return evt.wait() [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] result = hub.switch() [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] return self.greenlet.switch() [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1134.465484] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] self.f(*self.args, **self.kw) [ 1134.465795] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1134.465795] env[67270]: ERROR 
nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] raise exceptions.translate_fault(task_info.error) [ 1134.465795] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1134.465795] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Faults: ['InvalidArgument'] [ 1134.465795] env[67270]: ERROR nova.compute.manager [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] [ 1134.465913] env[67270]: DEBUG nova.compute.utils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1134.467604] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Build of instance 5d61c322-6a7d-4991-8cc4-6dcb1be74256 was re-scheduled: A specified parameter was not correct: fileType [ 1134.467604] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1134.467977] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1134.468166] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1134.468320] env[67270]: DEBUG nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1134.468482] env[67270]: DEBUG nova.network.neutron [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1134.743483] env[67270]: DEBUG nova.network.neutron [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1134.754745] env[67270]: INFO nova.compute.manager [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Took 0.29 seconds to deallocate network for instance. 
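The sequence just logged — spawn fails, the claim is aborted under the "compute_resources" lock, the network is deallocated, and the build "was re-scheduled" — is the compute manager's standard build-failure path. Below is a condensed, self-contained sketch of that control flow; every class and helper name here is invented for illustration and does not match Nova's exact signatures.

```python
class RescheduledException(Exception):
    pass

class Claim:
    def abort(self):
        # Runs under the "compute_resources" lock in the real tracker,
        # rolling back the claimed VCPU/MEMORY_MB/DISK_GB usage.
        print("aborting resource claim")

class Network:
    def deallocate_for_instance(self, instance):
        print(f"Deallocating network for instance {instance}")

class FailingDriver:
    def spawn(self, instance):
        raise ValueError("A specified parameter was not correct: fileType")

def build_and_run_instance(instance, claim, network, driver):
    """Any spawn error rolls back the resource claim and the network
    allocation, then asks the scheduler to retry the build elsewhere."""
    try:
        driver.spawn(instance)
    except Exception as exc:
        claim.abort()
        network.deallocate_for_instance(instance)
        # Surfaces in the log as "Build of instance ... was re-scheduled".
        raise RescheduledException(
            f"Build of instance {instance} was re-scheduled: {exc}") from exc

try:
    build_and_run_instance("5d61c322-6a7d-4991-8cc4-6dcb1be74256",
                           Claim(), Network(), FailingDriver())
except RescheduledException as e:
    print(e)
```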
[ 1134.844039] env[67270]: INFO nova.scheduler.client.report [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Deleted allocations for instance 5d61c322-6a7d-4991-8cc4-6dcb1be74256 [ 1134.862245] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3d393268-b213-4312-8fa0-8e0a1c6cc59b tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 437.739s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1134.862510] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 235.990s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1134.862760] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Acquiring lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1134.862970] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1134.863154] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1134.865491] env[67270]: INFO nova.compute.manager [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Terminating instance [ 1134.867261] env[67270]: DEBUG nova.compute.manager [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Start destroying the instance on the hypervisor. 
{{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1134.867504] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1134.868028] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9e66c1a6-bb9d-4519-a5be-1261f62ad072 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.877866] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a7af8d5-ccbb-4e01-9f81-5e37ad9f4210 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1134.904974] env[67270]: WARNING nova.virt.vmwareapi.vmops [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5d61c322-6a7d-4991-8cc4-6dcb1be74256 could not be found. [ 1134.905207] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1134.905391] env[67270]: INFO nova.compute.manager [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1134.905630] env[67270]: DEBUG oslo.service.loopingcall [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1134.906148] env[67270]: DEBUG nova.compute.manager [-] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1134.906252] env[67270]: DEBUG nova.network.neutron [-] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1134.929435] env[67270]: DEBUG nova.network.neutron [-] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1134.938389] env[67270]: INFO nova.compute.manager [-] [instance: 5d61c322-6a7d-4991-8cc4-6dcb1be74256] Took 0.03 seconds to deallocate network for instance. 
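Note what the second terminate above does: the cleanup after the failed build already unregistered the VM and deleted its datastore directory, so when the user-initiated delete arrives, FindAllByUuid comes back empty and vmops logs "Instance does not exist on backend" as a WARNING while still reporting "Instance destroyed". A minimal sketch of that tolerant-destroy behaviour, using a plain dict as a hypothetical stand-in for the vCenter backend:

```python
class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def destroy(uuid, backend):
    """Treat a VM that is already gone from the backend as destroyed
    rather than failing the terminate request."""
    try:
        vm_ref = backend.pop(uuid)
    except KeyError:
        print(f"WARNING: Instance does not exist on backend: "
              f"Instance {uuid} could not be found.")
        return
    print(f"Unregistering the VM {vm_ref} and deleting its files")

vms = {"5d61c322-6a7d-4991-8cc4-6dcb1be74256": "vm-123"}
destroy("5d61c322-6a7d-4991-8cc4-6dcb1be74256", vms)  # real teardown
destroy("5d61c322-6a7d-4991-8cc4-6dcb1be74256", vms)  # no-op with warning
```

This idempotence is why the second terminate only takes 0.04 seconds and the lock is released after 0.164s.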
[ 1135.026861] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c33d5111-b85b-4f1b-8921-aea659a02275 tempest-MigrationsAdminTest-811059248 tempest-MigrationsAdminTest-811059248-project-member] Lock "5d61c322-6a7d-4991-8cc4-6dcb1be74256" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.164s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.095712] env[67270]: WARNING oslo_vmware.rw_handles [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1148.095712] env[67270]: ERROR oslo_vmware.rw_handles [ 1148.096645] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/f860ce6f-4a3f-48fa-acc9-f80b93cc92a2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore2 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1148.097982] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1148.098532] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Copying Virtual Disk [datastore2] vmware_temp/f860ce6f-4a3f-48fa-acc9-f80b93cc92a2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore2] vmware_temp/f860ce6f-4a3f-48fa-acc9-f80b93cc92a2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1148.098832] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with 
opID=oslo.vmware-909b82af-8e2a-4ceb-b90d-d0301035d35e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.107812] env[67270]: DEBUG oslo_vmware.api [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Waiting for the task: (returnval){ [ 1148.107812] env[67270]: value = "task-4110696" [ 1148.107812] env[67270]: _type = "Task" [ 1148.107812] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1148.117532] env[67270]: DEBUG oslo_vmware.api [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Task: {'id': task-4110696, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1148.618862] env[67270]: DEBUG oslo_vmware.exceptions [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Fault InvalidArgument not matched. {{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1148.619117] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Releasing lock "[datastore2] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1148.619703] env[67270]: ERROR nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1148.619703] env[67270]: Faults: ['InvalidArgument'] [ 1148.619703] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Traceback (most recent call last): [ 1148.619703] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1148.619703] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] yield resources [ 1148.619703] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1148.619703] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] self.driver.spawn(context, instance, image_meta, [ 1148.619703] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1148.619703] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1148.619703] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1148.619703] env[67270]: ERROR nova.compute.manager [instance: 
972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] self._fetch_image_if_missing(context, vi) [ 1148.619703] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] image_cache(vi, tmp_image_ds_loc) [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] vm_util.copy_virtual_disk( [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] session._wait_for_task(vmdk_copy_task) [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] return self.wait_for_task(task_ref) [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] return evt.wait() [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] result = hub.switch() [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1148.620286] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] return self.greenlet.switch() [ 1148.620686] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1148.620686] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] self.f(*self.args, **self.kw) [ 1148.620686] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1148.620686] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] raise exceptions.translate_fault(task_info.error) [ 1148.620686] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1148.620686] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Faults: ['InvalidArgument'] [ 1148.620686] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] [ 1148.620686] env[67270]: INFO nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb 
tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Terminating instance [ 1148.622562] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1148.622765] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1148.623602] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d04e19b3-0291-45de-9aeb-9dc6c0158a6c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.631027] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1148.631273] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-11957bc7-8adb-42e9-8f64-62ffce8306bc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1149.277640] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1149.277948] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Deleting contents of the VM from datastore datastore2 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1149.278216] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Deleting the datastore file [datastore2] 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1149.278704] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bd165b65-4048-4e22-b5a4-5eeaae5dd154 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1149.286446] env[67270]: DEBUG oslo_vmware.api [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Waiting for the task: (returnval){ [ 1149.286446] env[67270]: value = "task-4110698" [ 1149.286446] env[67270]: _type = "Task" [ 1149.286446] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1149.295160] env[67270]: DEBUG oslo_vmware.api [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Task: {'id': task-4110698, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1149.797880] env[67270]: DEBUG oslo_vmware.api [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Task: {'id': task-4110698, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075951} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1149.798150] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1149.798334] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Deleted contents of the VM from datastore datastore2 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1149.798505] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1149.798683] env[67270]: INFO nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Took 1.18 seconds to destroy the instance on the hypervisor. 
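The WARNING at 1148.095712 (like the earlier one at 1133.090910) comes from the write handle's close(): after streaming the VMDK to the ESX host's /folder endpoint, close() attempts a final getresponse(), and the host may drop the connection without answering. oslo.vmware downgrades that to a warning because the payload was already fully written — note the "Completed reading data from the image iterator" line that precedes each "Closing write handle". A sketch of that close() behaviour, with an assumed HTTPSConnection-like object:

```python
import http.client
import logging

LOG = logging.getLogger(__name__)

class WriteHandle:
    """Sketch of the close() behaviour seen in oslo_vmware.rw_handles:
    the final getresponse() is best-effort, so a server that hangs up
    without a response produces only a WARNING, not a failure."""

    def __init__(self, conn):
        self._conn = conn   # assumed http.client.HTTPSConnection-like

    def close(self):
        try:
            self._conn.getresponse()
        except http.client.RemoteDisconnected:
            # Matches the "Error occurred while reading the HTTP
            # response." warning plus traceback in the log above.
            LOG.warning("Error occurred while reading the HTTP response.",
                        exc_info=True)
        finally:
            self._conn.close()
```

That design choice is visible in the log ordering: the warning is immediately followed by a DEBUG "Downloaded image file data ..." line, i.e. the fetch is still treated as successful.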
[ 1149.800805] env[67270]: DEBUG nova.compute.claims [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1149.801015] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1149.801249] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1149.887306] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fe52fa1-bab5-43b8-9b42-5db98d3ecf40 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1149.895387] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3cdcb56-79db-440a-b3d1-8abcdacde98d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1149.926334] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c96d3826-f37e-4553-be76-13ab7de494a7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1149.934642] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-271bf552-6f04-4f32-9516-d1d7ed7f7e3e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1149.948717] env[67270]: DEBUG nova.compute.provider_tree [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1149.957736] env[67270]: DEBUG nova.scheduler.client.report [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1149.971829] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 
tempest-ServerAddressesTestJSON-1065680081-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.170s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1149.972373] env[67270]: ERROR nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1149.972373] env[67270]: Faults: ['InvalidArgument'] [ 1149.972373] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Traceback (most recent call last): [ 1149.972373] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1149.972373] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] self.driver.spawn(context, instance, image_meta, [ 1149.972373] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1149.972373] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1149.972373] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1149.972373] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] self._fetch_image_if_missing(context, vi) [ 1149.972373] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1149.972373] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] image_cache(vi, tmp_image_ds_loc) [ 1149.972373] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] vm_util.copy_virtual_disk( [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] session._wait_for_task(vmdk_copy_task) [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] return self.wait_for_task(task_ref) [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] return evt.wait() [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 
972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] result = hub.switch() [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] return self.greenlet.switch() [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1149.972754] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] self.f(*self.args, **self.kw) [ 1149.973176] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1149.973176] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] raise exceptions.translate_fault(task_info.error) [ 1149.973176] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1149.973176] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Faults: ['InvalidArgument'] [ 1149.973176] env[67270]: ERROR nova.compute.manager [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] [ 1149.973176] env[67270]: DEBUG nova.compute.utils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1149.974599] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Build of instance 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19 was re-scheduled: A specified parameter was not correct: fileType [ 1149.974599] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1149.975008] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1149.975199] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1149.975378] env[67270]: DEBUG nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1149.975533] env[67270]: DEBUG nova.network.neutron [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1150.289194] env[67270]: DEBUG nova.network.neutron [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1150.301947] env[67270]: INFO nova.compute.manager [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] [instance: 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19] Took 0.32 seconds to deallocate network for instance. [ 1150.387141] env[67270]: INFO nova.scheduler.client.report [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Deleted allocations for instance 972c064e-2a9f-4afb-95b6-f6dd6b8a7a19 [ 1150.410185] env[67270]: DEBUG oslo_concurrency.lockutils [None req-3292e670-b7d7-47be-96dd-e69bff8587fb tempest-ServerAddressesTestJSON-1065680081 tempest-ServerAddressesTestJSON-1065680081-project-member] Lock "972c064e-2a9f-4afb-95b6-f6dd6b8a7a19" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 150.342s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1174.758319] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1174.758797] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1174.758797] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... 
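{{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}

The _poll_* and _reclaim_queued_deletes entries are oslo.service periodic tasks: each decorated method fires on its own interval inside the compute manager, and _reclaim_queued_deletes returns early here because reclaim_instance_interval is not set to a positive value in this deployment. A minimal sketch of that machinery, assuming the documented oslo_service.periodic_task API (the 60-second spacing and the option registration are illustrative; Nova registers its own options):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF
    # Illustrative registration; Nova defines reclaim_instance_interval itself.
    CONF.register_opts([cfg.IntOpt("reclaim_instance_interval", default=0)])

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)  # interval is per task
        def _reclaim_queued_deletes(self, context):
            if CONF.reclaim_instance_interval <= 0:
                return  # the "skipping..." branch seen in the log

    Manager().run_periodic_tasks(None)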
[ 1175.758477] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1175.758874] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1175.758874] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1175.771970] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1175.772170] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1175.772305] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1175.772772] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1175.772942] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1175.782815] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1175.783066] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1175.783236] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1175.783403] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) 
{{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1175.784518] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f474c0b-aea3-41b3-b649-9c103a73fa7a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1175.793787] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37befee2-ba5b-4015-809d-af3a17a2b166 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1175.810417] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06a6aa1a-aef5-4833-b418-b754a6989fd1 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1175.818465] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c3a5570-138b-49cb-b87c-5a01a01d0de7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1175.850820] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180782MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1175.851029] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1175.851256] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1175.899321] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1175.899495] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 65509bc1-a140-416a-a465-4c9e6efce4a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1175.899693] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1175.899836] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1175.942873] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f02938c-d5b1-42d2-8d6a-71834c94de56 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1175.950520] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a0bb88d-1b7c-4f9b-828d-9b05f1b1978c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1175.983436] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25dd43d7-1099-495a-b783-024c4e16392a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1175.991729] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d631f69e-6e9b-4851-883d-187f27793e30 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1176.005802] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1176.015398] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1176.029455] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1176.029707] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1178.015200] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] 
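Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}

The inventory dict that update_available_resource keeps reporting is what placement sizes the host with: usable capacity per resource class is (total - reserved) * allocation_ratio, and no single allocation may exceed max_unit. Checking the figures from the inventory logged above in plain Python (values copied from the log; the arithmetic is the standard placement capacity formula):

    # Inventory as logged for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd
    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)
    # VCPU 192.0       (48 physical vCPUs oversubscribed 4x; 2 allocated above)
    # MEMORY_MB 196078.0
    # DISK_GB 400.0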
[ 1179.754033] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1179.757887] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1179.757887] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1183.103626] env[67270]: WARNING oslo_vmware.rw_handles [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1183.103626] env[67270]: ERROR oslo_vmware.rw_handles [ 1183.104569] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/864a8815-1633-4869-97ef-b54e6ac7d21f/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1183.106227] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1183.106480] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None 
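req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Copying Virtual Disk [datastore1] vmware_temp/864a8815-1633-4869-97ef-b54e6ac7d21f/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/864a8815-1633-4869-97ef-b54e6ac7d21f/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}

The rw_handles WARNING a few entries up is not fatal in this run: the images.py entry right after it confirms the image data reached the datastore, so the remote end simply closed the HTTP connection before answering the final request that close() issues. At the stdlib level that failure mode looks like the sketch below (host and path are placeholders; outside this environment the connection attempt itself fails, which the sketch also tolerates):

    import http.client

    conn = http.client.HTTPSConnection("vc.example.test", timeout=5)  # placeholder
    try:
        conn.request("GET", "/folder/tmp-sparse.vmdk")
        conn.getresponse()  # RemoteDisconnected if the peer hangs up silently
    except http.client.RemoteDisconnected:
        pass  # what rw_handles logs; the transfer itself had already finished
    except OSError:
        pass  # the placeholder host is unreachable outside the lab
    finally:
        conn.close()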
[ 1183.106748] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-578ed167-c2ff-43aa-aa5a-868fd09c3551 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.116679] env[67270]: DEBUG oslo_vmware.api [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Waiting for the task: (returnval){ [ 1183.116679] env[67270]: value = "task-4110699" [ 1183.116679] env[67270]: _type = "Task" [ 1183.116679] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1183.125150] env[67270]: DEBUG oslo_vmware.api [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Task: {'id': task-4110699, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1183.627583] env[67270]: DEBUG oslo_vmware.exceptions [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Fault InvalidArgument not matched. 
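{{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}

"Fault InvalidArgument not matched" means oslo.vmware found no specific exception class registered for that fault name, so the failed task surfaces as the generic VimFaultException whose fault_list still carries 'InvalidArgument'; that is exactly what the surrounding tracebacks show. A sketch of branching on such a fault after polling, assuming oslo.vmware's public exceptions module (session and task are stand-ins for a live VMwareAPISession and task reference):

    from oslo_vmware import exceptions as vexc

    def copy_disk_and_classify(session, task):
        try:
            return session.wait_for_task(task)  # polls until success or fault
        except vexc.VimFaultException as exc:
            # Unmatched VMODL faults keep their names in fault_list
            if "InvalidArgument" in exc.fault_list:
                raise RuntimeError(
                    "CopyVirtualDisk rejected a parameter: %s" % exc) from exc
            raise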
[ 1183.627841] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1183.628439] env[67270]: ERROR nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1183.628439] env[67270]: Faults: ['InvalidArgument'] [ 1183.628439] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Traceback (most recent call last): [ 1183.628439] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1183.628439] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] yield resources [ 1183.628439] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1183.628439] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] self.driver.spawn(context, instance, image_meta, [ 1183.628439] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1183.628439] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1183.628439] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1183.628439] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] self._fetch_image_if_missing(context, vi) [ 1183.628439] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] image_cache(vi, tmp_image_ds_loc) [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] vm_util.copy_virtual_disk( [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] session._wait_for_task(vmdk_copy_task) [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] return self.wait_for_task(task_ref) [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] return evt.wait() [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] result = hub.switch() [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1183.628984] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] return self.greenlet.switch() [ 1183.629332] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1183.629332] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] self.f(*self.args, **self.kw) [ 1183.629332] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1183.629332] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] raise exceptions.translate_fault(task_info.error) [ 1183.629332] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1183.629332] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Faults: ['InvalidArgument'] [ 1183.629332] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] [ 1183.629332] env[67270]: INFO nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Terminating instance [ 1183.630334] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1183.630540] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1183.630769] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9aae8c30-deb2-4b14-be0f-e4e194ac4fd0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.633070] env[67270]: 
DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquiring lock "refresh_cache-8b43a9a6-b28c-43ed-9f83-02424f73dc3c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1183.633177] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquired lock "refresh_cache-8b43a9a6-b28c-43ed-9f83-02424f73dc3c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1183.633319] env[67270]: DEBUG nova.network.neutron [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1183.640491] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1183.640662] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1183.641845] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2f8a68b2-cca9-4d40-bd03-00f3e3e00bbd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.649798] env[67270]: DEBUG oslo_vmware.api [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Waiting for the task: (returnval){ [ 1183.649798] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52fd6993-2f59-3a8c-3d68-2ccaad0bbe14" [ 1183.649798] env[67270]: _type = "Task" [ 1183.649798] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1183.657784] env[67270]: DEBUG oslo_vmware.api [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52fd6993-2f59-3a8c-3d68-2ccaad0bbe14, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1183.662136] env[67270]: DEBUG nova.network.neutron [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Instance cache missing network info. 
{{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1183.722026] env[67270]: DEBUG nova.network.neutron [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1183.730678] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Releasing lock "refresh_cache-8b43a9a6-b28c-43ed-9f83-02424f73dc3c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1183.731088] env[67270]: DEBUG nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1183.731282] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1183.732416] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f44db8b5-63d0-413e-8393-d368f70d670b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.740377] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1183.740598] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-aa63e8c0-c123-4616-b111-fd89bc0c568c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.772031] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1183.772279] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1183.772462] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Deleting the datastore file [datastore1] 8b43a9a6-b28c-43ed-9f83-02424f73dc3c 
{{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1183.772726] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7674196f-c23f-4555-ac1c-8ebf354df06f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.779299] env[67270]: DEBUG oslo_vmware.api [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Waiting for the task: (returnval){ [ 1183.779299] env[67270]: value = "task-4110701" [ 1183.779299] env[67270]: _type = "Task" [ 1183.779299] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1183.788771] env[67270]: DEBUG oslo_vmware.api [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Task: {'id': task-4110701, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1184.160649] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1184.161060] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Creating directory with path [datastore1] vmware_temp/4b247e9a-160b-41ac-adff-e367c6637ae4/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1184.161930] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6af07ff8-26df-4149-8520-53846eaa7723 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.172815] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Created directory with path [datastore1] vmware_temp/4b247e9a-160b-41ac-adff-e367c6637ae4/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1184.173026] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Fetch image to [datastore1] vmware_temp/4b247e9a-160b-41ac-adff-e367c6637ae4/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1184.173205] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/4b247e9a-160b-41ac-adff-e367c6637ae4/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1184.174012] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a5f4421-9051-441d-bc14-82bec0b83917 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.181319] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b50f24a1-6402-48c8-9af9-a305e208937a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.190625] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a827553b-966c-4a3b-ab13-afb0ccac3475 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.222160] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e4d9bad-a4b7-4a52-8fbc-1e13e0f9a5a0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.228423] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b5a9b6e3-7eeb-4c70-bfdf-13c2b814403f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.249123] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1184.291931] env[67270]: DEBUG oslo_vmware.api [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Task: {'id': task-4110701, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.04487} completed successfully. 
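{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}

The sequence above (Preparing fetch location, MakeDirectory under vmware_temp/, Fetch image to a tmp-sparse.vmdk, then the download itself) is the cache-miss path of the driver's image handling: each request downloads into its own temporary directory and only promotes the result into devstack-image-cache_base while holding a per-image lock. The shape of that fetch-if-missing flow, condensed to local filesystem primitives (paths and helper names are illustrative, not the driver's API):

    import os
    import shutil
    import tempfile
    import uuid

    from oslo_concurrency import lockutils

    CACHE = os.path.join(tempfile.gettempdir(), "devstack-image-cache_base")

    def fetch_image_if_missing(image_id, download):
        cached = os.path.join(CACHE, image_id, image_id + ".vmdk")
        with lockutils.lock(cached):  # per-image lock, as in the log
            if os.path.exists(cached):
                return cached  # cache hit: no fetch needed
            tmp_dir = os.path.join(tempfile.gettempdir(), "vmware_temp",
                                   str(uuid.uuid4()), image_id)
            os.makedirs(tmp_dir, exist_ok=True)  # "Creating directory with path"
            tmp_file = os.path.join(tmp_dir, "tmp-sparse.vmdk")
            download(tmp_file)                   # "Fetch image to ..."
            os.makedirs(os.path.dirname(cached), exist_ok=True)
            shutil.move(tmp_file, cached)        # promote into the cache
            return cached

    fetch_image_if_missing("1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a",
                           lambda path: open(path, "wb").close())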
[ 1184.292340] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1184.292601] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1184.292790] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1184.292966] env[67270]: INFO nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1184.293226] env[67270]: DEBUG oslo.service.loopingcall [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1184.293443] env[67270]: DEBUG nova.compute.manager [-] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Skipping network deallocation for instance since networking was not requested. 
{{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1184.295679] env[67270]: DEBUG nova.compute.claims [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1184.295845] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1184.296070] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1184.367671] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22c7b2f1-b004-48a3-9673-9cbb1d642a63 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.373902] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be46d4fc-5f27-4d60-8ead-5bd6eecf0b00 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.407971] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3db7195-f220-4319-aa69-02546b8ab55c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.416420] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c50e8039-8d71-4dd1-a480-bae5735a6466 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.431782] env[67270]: DEBUG nova.compute.provider_tree [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1184.440559] env[67270]: DEBUG nova.scheduler.client.report [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1184.453975] env[67270]: DEBUG oslo_concurrency.lockutils [None 
req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.158s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1184.454571] env[67270]: ERROR nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1184.454571] env[67270]: Faults: ['InvalidArgument'] [ 1184.454571] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Traceback (most recent call last): [ 1184.454571] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1184.454571] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] self.driver.spawn(context, instance, image_meta, [ 1184.454571] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1184.454571] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1184.454571] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1184.454571] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] self._fetch_image_if_missing(context, vi) [ 1184.454571] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1184.454571] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] image_cache(vi, tmp_image_ds_loc) [ 1184.454571] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] vm_util.copy_virtual_disk( [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] session._wait_for_task(vmdk_copy_task) [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] return self.wait_for_task(task_ref) [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] 
return evt.wait() [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] result = hub.switch() [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] return self.greenlet.switch() [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1184.454994] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] self.f(*self.args, **self.kw) [ 1184.455307] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1184.455307] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] raise exceptions.translate_fault(task_info.error) [ 1184.455307] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1184.455307] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Faults: ['InvalidArgument'] [ 1184.455307] env[67270]: ERROR nova.compute.manager [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] [ 1184.455430] env[67270]: DEBUG nova.compute.utils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1184.460121] env[67270]: DEBUG nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Build of instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c was re-scheduled: A specified parameter was not correct: fileType [ 1184.460121] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1184.460121] env[67270]: DEBUG nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1184.460121] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquiring lock "refresh_cache-8b43a9a6-b28c-43ed-9f83-02424f73dc3c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1184.460393] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 
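tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquired lock "refresh_cache-8b43a9a6-b28c-43ed-9f83-02424f73dc3c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}

This is the second instance in the section to fail on the same fileType fault, and the cleanup is identical each time: abort the claim, delete the placement allocations, then re-schedule the build instead of setting the instance to error, on the chance that another host will not hit the fault. Reduced to its control flow, the pattern looks like this (all names are stand-ins for the compute manager's internals, not Nova's literal code):

    class RescheduleRequested(Exception):
        """Signal that the build should be retried elsewhere."""

    def build_and_run(instance, spawn, abort_claim, delete_allocations):
        # spawn/abort_claim/delete_allocations stand in for driver and
        # resource-tracker calls made by the real manager.
        try:
            spawn(instance)
        except Exception as exc:
            abort_claim(instance)         # "Aborting claim:" in the log
            delete_allocations(instance)  # "Deleted allocations for instance"
            # mirrors "Build of instance ... was re-scheduled"
            raise RescheduleRequested(str(exc)) from exc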
[ 1184.460393] env[67270]: DEBUG nova.network.neutron [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1184.462525] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1184.463299] env[67270]: ERROR nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a. [ 1184.463299] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Traceback (most recent call last): [ 1184.463299] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1184.463299] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1184.463299] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1184.463299] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] result = getattr(controller, method)(*args, **kwargs) [ 1184.463299] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1184.463299] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] return self._get(image_id) [ 1184.463299] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1184.463299] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1184.463299] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] resp, body = self.http_client.get(url, headers=header) [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] return self.request(url, 'GET', 
**kwargs) [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] return self._handle_response(resp) [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] raise exc.from_response(resp, resp.content) [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] During handling of the above exception, another exception occurred: [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] [ 1184.463641] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Traceback (most recent call last): [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] yield resources [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] self.driver.spawn(context, instance, image_meta, [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] self._fetch_image_if_missing(context, vi) [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] image_fetch(context, vi, tmp_image_ds_loc) [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] 
images.fetch_image( [ 1184.463931] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] metadata = IMAGE_API.get(context, image_ref) [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] return session.show(context, image_id, [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] _reraise_translated_image_exception(image_id) [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] raise new_exc.with_traceback(exc_trace) [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] result = getattr(controller, method)(*args, **kwargs) [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1184.464274] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] return self._get(image_id) [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] resp, body = self.http_client.get(url, headers=header) [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] return self.request(url, 'GET', **kwargs) [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1184.464601] 
env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] return self._handle_response(resp) [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] raise exc.from_response(resp, resp.content) [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] nova.exception.ImageNotAuthorized: Not authorized for image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a. [ 1184.464601] env[67270]: ERROR nova.compute.manager [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] [ 1184.464903] env[67270]: INFO nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Terminating instance [ 1184.465104] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1184.465323] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1184.466152] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquiring lock "refresh_cache-8ddc70e6-ec6f-4740-8109-6ba2c5d00536" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1184.466313] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquired lock "refresh_cache-8ddc70e6-ec6f-4740-8109-6ba2c5d00536" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1184.466482] env[67270]: DEBUG nova.network.neutron [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1184.467370] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f5b3fff5-d13d-447d-bb9c-f0af1e78e75e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.473806] env[67270]: DEBUG nova.compute.utils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Can not refresh info_cache because instance was not found {{(pid=67270) refresh_info_cache_for_instance 
/opt/stack/nova/nova/compute/utils.py:1010}} [ 1184.480049] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1184.480049] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1184.480209] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e496575b-8491-4aa2-8ed8-9e9591f2c72a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.486232] env[67270]: DEBUG oslo_vmware.api [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Waiting for the task: (returnval){ [ 1184.486232] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52869606-8d38-1960-9a1f-b1da1d4b5657" [ 1184.486232] env[67270]: _type = "Task" [ 1184.486232] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1184.490153] env[67270]: DEBUG nova.network.neutron [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1184.492728] env[67270]: DEBUG nova.network.neutron [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Instance cache missing network info. 
{{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1184.501570] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1184.501867] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Creating directory with path [datastore1] vmware_temp/b01abfae-ec72-4b16-9111-2e8b18ae53e2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1184.502122] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9e889775-43e5-4b86-93fa-bec582ce8949 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.523468] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Created directory with path [datastore1] vmware_temp/b01abfae-ec72-4b16-9111-2e8b18ae53e2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1184.523709] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Fetch image to [datastore1] vmware_temp/b01abfae-ec72-4b16-9111-2e8b18ae53e2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1184.523792] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/b01abfae-ec72-4b16-9111-2e8b18ae53e2/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1184.524666] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc71ed22-1ffe-49b3-9637-7e990235bb17 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.534409] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56913d07-9c99-4c54-8fc2-938768f43af9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.545269] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5fe17a5-a132-4f12-a848-b8ab09a03beb {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.579116] env[67270]: DEBUG nova.network.neutron [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 
tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1184.580948] env[67270]: DEBUG nova.network.neutron [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1184.582238] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f50842c-acc1-4613-86a4-1d1095895419 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.587694] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Releasing lock "refresh_cache-8b43a9a6-b28c-43ed-9f83-02424f73dc3c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1184.587847] env[67270]: DEBUG nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1184.588034] env[67270]: DEBUG nova.compute.manager [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Skipping network deallocation for instance since networking was not requested. {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1184.592061] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Releasing lock "refresh_cache-8ddc70e6-ec6f-4740-8109-6ba2c5d00536" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1184.592423] env[67270]: DEBUG nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Start destroying the instance on the hypervisor. 
{{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1184.592610] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1184.593926] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-364aa121-b98e-4084-bf1a-90aa74e34080 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.596817] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a5e0bb65-7f84-4e0f-bcc3-95d07ab10726 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.603207] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1184.603419] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c1defb7b-b489-4b33-8bd2-4f379a149968 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.625675] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1184.631973] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1184.632200] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1184.632390] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Deleting the datastore file [datastore1] 8ddc70e6-ec6f-4740-8109-6ba2c5d00536 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1184.632912] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-75ec2e6f-bf7a-4fa3-b2e8-5d4f53757c29 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.640306] env[67270]: DEBUG oslo_vmware.api [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Waiting for the task: 
(returnval){ [ 1184.640306] env[67270]: value = "task-4110703" [ 1184.640306] env[67270]: _type = "Task" [ 1184.640306] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1184.649608] env[67270]: DEBUG oslo_vmware.api [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Task: {'id': task-4110703, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1184.678690] env[67270]: INFO nova.scheduler.client.report [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Deleted allocations for instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c [ 1184.698606] env[67270]: DEBUG oslo_concurrency.lockutils [None req-28e5b159-6953-409f-8632-aebe64624312 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Lock "8b43a9a6-b28c-43ed-9f83-02424f73dc3c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 486.174s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1184.698606] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Lock "8b43a9a6-b28c-43ed-9f83-02424f73dc3c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 285.232s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1184.698750] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquiring lock "8b43a9a6-b28c-43ed-9f83-02424f73dc3c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1184.698894] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Lock "8b43a9a6-b28c-43ed-9f83-02424f73dc3c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1184.699074] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Lock "8b43a9a6-b28c-43ed-9f83-02424f73dc3c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1184.700940] env[67270]: INFO nova.compute.manager [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Terminating instance [ 1184.703033] env[67270]: DEBUG oslo_concurrency.lockutils [None 
req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquiring lock "refresh_cache-8b43a9a6-b28c-43ed-9f83-02424f73dc3c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1184.703525] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Acquired lock "refresh_cache-8b43a9a6-b28c-43ed-9f83-02424f73dc3c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1184.703525] env[67270]: DEBUG nova.network.neutron [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1184.730537] env[67270]: DEBUG nova.network.neutron [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1184.732769] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1184.733529] env[67270]: ERROR nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a. 
[ 1184.733529] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Traceback (most recent call last):
[ 1184.733529] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1184.733529] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1184.733529] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1184.733529] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] result = getattr(controller, method)(*args, **kwargs)
[ 1184.733529] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1184.733529] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self._get(image_id)
[ 1184.733529] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1184.733529] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1184.733529] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] resp, body = self.http_client.get(url, headers=header)
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self.request(url, 'GET', **kwargs)
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self._handle_response(resp)
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise exc.from_response(resp, resp.content)
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb]
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] During handling of the above exception, another exception occurred:
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb]
[ 1184.734104] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Traceback (most recent call last):
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] yield resources
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self.driver.spawn(context, instance, image_meta,
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self._fetch_image_if_missing(context, vi)
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] image_fetch(context, vi, tmp_image_ds_loc)
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] images.fetch_image(
[ 1184.734397] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] metadata = IMAGE_API.get(context, image_ref)
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return session.show(context, image_id,
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] _reraise_translated_image_exception(image_id)
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise new_exc.with_traceback(exc_trace)
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] result = getattr(controller, method)(*args, **kwargs)
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1184.734727] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self._get(image_id)
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] resp, body = self.http_client.get(url, headers=header)
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self.request(url, 'GET', **kwargs)
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self._handle_response(resp)
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise exc.from_response(resp, resp.content)
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] nova.exception.ImageNotAuthorized: Not authorized for image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.
[ 1184.735061] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb]
[ 1184.735347] env[67270]: INFO nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Terminating instance
[ 1184.735347] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1184.735713] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1184.735713] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b9949a4d-14b3-4f46-bebd-261f694d9b4e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1184.738071] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1184.738263] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1184.739039] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a3b3dd5-67c6-40cc-aa79-97277369b688 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1184.748740] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1184.749771] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2edf4fc0-d196-4226-8b0a-141b1a6cbb5f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1184.751361] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1184.751534] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1184.752249] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-35d44a4b-790b-4e43-adf6-bd0c0b6e76e4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1184.757299] env[67270]: DEBUG oslo_vmware.api [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Waiting for the task: (returnval){
[ 1184.757299] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52b2b1a8-b53d-a218-c1a7-43d5f9aa37f2"
[ 1184.757299] env[67270]: _type = "Task"
[ 1184.757299] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1184.765667] env[67270]: DEBUG oslo_vmware.api [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52b2b1a8-b53d-a218-c1a7-43d5f9aa37f2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1184.791035] env[67270]: DEBUG nova.network.neutron [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1184.800651] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Releasing lock "refresh_cache-8b43a9a6-b28c-43ed-9f83-02424f73dc3c" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1184.801142] env[67270]: DEBUG nova.compute.manager [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1184.801362] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1184.801900] env[67270]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-23a7a083-0459-4ab2-93b6-eecdc3ada269 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1184.811227] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f45d57b3-7229-4188-9e20-6d6e6dedac0e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1184.830356] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1184.830631] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1184.830775] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Deleting the datastore file [datastore1] 2de499d5-2eb3-4138-8c6b-41fb94ff27eb {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1184.830993] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b3e892ab-f261-471b-818f-8b4065630ae7 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1185.524508] env[67270]: WARNING nova.virt.vmwareapi.vmops [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8b43a9a6-b28c-43ed-9f83-02424f73dc3c could not be found.
[ 1185.524995] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1185.524995] env[67270]: INFO nova.compute.manager [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Took 0.72 seconds to destroy the instance on the hypervisor.
[ 1185.525217] env[67270]: DEBUG oslo.service.loopingcall [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1185.531959] env[67270]: DEBUG nova.compute.manager [-] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1185.532081] env[67270]: DEBUG nova.network.neutron [-] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1185.533690] env[67270]: DEBUG oslo_vmware.api [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Waiting for the task: (returnval){ [ 1185.533690] env[67270]: value = "task-4110705" [ 1185.533690] env[67270]: _type = "Task" [ 1185.533690] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1185.543864] env[67270]: DEBUG oslo_vmware.api [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Task: {'id': task-4110703, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.036256} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1185.545297] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1185.545481] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1185.545652] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1185.545821] env[67270]: INFO nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Took 0.95 seconds to destroy the instance on the hypervisor. [ 1185.546071] env[67270]: DEBUG oslo.service.loopingcall [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1185.546280] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1185.546488] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Creating directory with path [datastore1] vmware_temp/5e83ec70-4fab-4572-85df-50e7c4f52116/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1185.546724] env[67270]: DEBUG nova.compute.manager [-] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Skipping network deallocation for instance since networking was not requested. {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1185.546884] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-66c2ab44-d530-4985-a35a-f2554b2fe0e3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1185.552542] env[67270]: DEBUG nova.network.neutron [-] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1185.557017] env[67270]: DEBUG nova.compute.claims [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1185.557221] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1185.557431] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1185.560341] env[67270]: DEBUG oslo_vmware.api [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Task: {'id': task-4110705, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068145} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1185.560671] env[67270]: DEBUG nova.network.neutron [-] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1185.561687] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1185.561920] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1185.562059] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1185.562228] env[67270]: INFO nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Took 0.82 seconds to destroy the instance on the hypervisor. [ 1185.564035] env[67270]: DEBUG nova.compute.claims [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1185.564208] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1185.565350] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Created directory with path [datastore1] vmware_temp/5e83ec70-4fab-4572-85df-50e7c4f52116/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1185.565527] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Fetch image to [datastore1] vmware_temp/5e83ec70-4fab-4572-85df-50e7c4f52116/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1185.565692] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 
[ 1185.565692] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/5e83ec70-4fab-4572-85df-50e7c4f52116/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1185.566618] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36f35dfb-a544-4339-99ee-c3c6c6fa1487 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1185.569600] env[67270]: INFO nova.compute.manager [-] [instance: 8b43a9a6-b28c-43ed-9f83-02424f73dc3c] Took 0.04 seconds to deallocate network for instance.
[ 1185.577165] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a5b6959-fff4-42ad-aae6-ab502b81a71a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1185.586816] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-679c2573-d6de-4670-b265-8f8b7b959948 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1185.618598] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.061s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1185.619355] env[67270]: DEBUG nova.compute.utils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Instance 8ddc70e6-ec6f-4740-8109-6ba2c5d00536 could not be found. {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1185.623241] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebf5d1dd-57f5-4829-981e-01b0e62a0f21 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1185.625934] env[67270]: DEBUG nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Instance disappeared during build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
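"Instance disappeared during build." marks the race running through this whole window: the tempest cleanup deletes the server while the compute host is still claiming resources for it, so the build path gives up quietly and only network cleanup continues. Simplified control flow of that path (the exception class and callbacks here are stand-ins, not Nova's actual signatures):

    class InstanceNotFound(Exception):
        pass

    def do_build_and_run_instance(instance_uuid, build, cleanup_networks):
        try:
            build(instance_uuid)
        except InstanceNotFound:
            # Nothing exists on the hypervisor to roll back, but network
            # resources allocated earlier may still need to be released.
            print('Instance disappeared during build.')
            cleanup_networks(instance_uuid)

    def build(instance_uuid):
        # Stand-in for a build step that hits the DB after a parallel delete.
        raise InstanceNotFound(instance_uuid)

    do_build_and_run_instance(
        '8ddc70e6-ec6f-4740-8109-6ba2c5d00536', build,
        cleanup_networks=lambda uuid: print('Unplugging VIFs for', uuid))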
[ 1185.626112] env[67270]: DEBUG nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1185.626332] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquiring lock "refresh_cache-8ddc70e6-ec6f-4740-8109-6ba2c5d00536" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1185.626473] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Acquired lock "refresh_cache-8ddc70e6-ec6f-4740-8109-6ba2c5d00536" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1185.626628] env[67270]: DEBUG nova.network.neutron [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1185.627777] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.063s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1185.636660] env[67270]: DEBUG nova.compute.utils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Can not refresh info_cache because instance was not found {{(pid=67270) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}}
[ 1185.639901] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2e6b0212-a781-4235-8ea1-908391bc951b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1185.655383] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1185.656115] env[67270]: DEBUG nova.compute.utils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Instance 2de499d5-2eb3-4138-8c6b-41fb94ff27eb could not be found. {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
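The refresh_cache-<uuid> lock above serializes refreshes of the per-instance network info cache; because the instance row is already gone, the refresh falls back to an empty network_info ("Can not refresh info_cache because instance was not found", then "Updating instance_info_cache with network_info: []"). A rough stand-in for the pattern using plain threading primitives (the lock registry and cache dict are illustrative, not Nova's InstanceInfoCache):

    import threading
    from collections import defaultdict

    _locks = defaultdict(threading.Lock)
    _info_cache = {}

    def refresh_info_cache(instance_uuid, fetch_nw_info):
        with _locks['refresh_cache-%s' % instance_uuid]:
            try:
                nw_info = fetch_nw_info(instance_uuid)
            except KeyError:
                # Instance deleted under us: cache an empty network_info.
                nw_info = []
            _info_cache[instance_uuid] = nw_info
            print('Updating instance_info_cache with network_info:', nw_info)

    # {}[uuid] raises KeyError, simulating the already-deleted instance.
    refresh_info_cache('8ddc70e6-ec6f-4740-8109-6ba2c5d00536',
                       fetch_nw_info=lambda uuid: {}[uuid])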
{{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1185.657765] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Instance disappeared during build. {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1185.657765] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1185.657888] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1185.658065] env[67270]: DEBUG nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1185.658225] env[67270]: DEBUG nova.network.neutron [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1185.660634] env[67270]: DEBUG nova.network.neutron [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Instance cache missing network info. {{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1185.667341] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1185.685180] env[67270]: DEBUG oslo_concurrency.lockutils [None req-c44a18c0-3b0b-4571-94de-4d5d0f3be4a4 tempest-ServerDiagnosticsV248Test-1333813558 tempest-ServerDiagnosticsV248Test-1333813558-project-member] Lock "8b43a9a6-b28c-43ed-9f83-02424f73dc3c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.987s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1185.731674] env[67270]: DEBUG oslo_vmware.rw_handles [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5e83ec70-4fab-4572-85df-50e7c4f52116/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1185.731674] env[67270]: DEBUG nova.network.neutron [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1185.786467] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Releasing lock "refresh_cache-8ddc70e6-ec6f-4740-8109-6ba2c5d00536" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1185.786673] env[67270]: DEBUG nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1185.786814] env[67270]: DEBUG nova.compute.manager [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Skipping network deallocation for instance since networking was not requested. {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 1185.791821] env[67270]: DEBUG oslo_vmware.rw_handles [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
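The 21318656-byte image is streamed from Glance straight onto the datastore: oslo.vmware opens an HTTPS write handle against the ESX host's /folder endpoint, authenticated by the generic service ticket acquired a few lines earlier, and feeds it the Glance image iterator. Roughly equivalent standalone code (this is not oslo.vmware's FileWriteHandle; the cookie name and TLS handling are assumptions for illustration):

    import requests

    def upload_vmdk(host, ds_path, datastore, ticket, image_data):
        # PUT the raw VMDK bytes to the datastore's HTTP file-access API.
        url = 'https://%s:443/folder/%s' % (host, ds_path)
        resp = requests.put(
            url,
            params={'dcPath': 'ha-datacenter', 'dsName': datastore},
            headers={'Cookie': 'vmware_cgi_ticket=%s' % ticket},  # assumed
            data=image_data,
            verify=False)  # CI lab; verify against the vCenter CA elsewhere
        resp.raise_for_status()

The "Remote end closed connection without response" warning further down appears to be the benign tail of this transfer: close() tries to read a final HTTP response after the host has already dropped the connection, yet the download is still logged as completed.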
[ 1185.792614] env[67270]: DEBUG oslo_vmware.rw_handles [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5e83ec70-4fab-4572-85df-50e7c4f52116/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1185.833419] env[67270]: DEBUG oslo_concurrency.lockutils [None req-51859bcb-450f-4616-abb5-da7b84b7cbbb tempest-ServerShowV247Test-1023161172 tempest-ServerShowV247Test-1023161172-project-member] Lock "8ddc70e6-ec6f-4740-8109-6ba2c5d00536" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 425.921s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1185.864938] env[67270]: DEBUG neutronclient.v2_0.client [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=67270) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1185.866624] env[67270]: ERROR nova.compute.manager [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Traceback (most recent call last): [ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] result = getattr(controller, method)(*args, **kwargs) [ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self._get(image_id) [ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1185.866624] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] resp, body = self.http_client.get(url, headers=header) [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self.request(url, 'GET', **kwargs) [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self._handle_response(resp) [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise exc.from_response(resp, resp.content) [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] During handling of the above exception, another exception occurred: [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Traceback (most recent call last): [ 1185.867127] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self.driver.spawn(context, instance, image_meta, [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self._fetch_image_if_missing(context, vi) [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] image_fetch(context, vi, tmp_image_ds_loc) [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] images.fetch_image( [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] metadata = IMAGE_API.get(context, image_ref) [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1185.867535] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return session.show(context, image_id, [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] _reraise_translated_image_exception(image_id) [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise new_exc.with_traceback(exc_trace) [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 
2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] result = getattr(controller, method)(*args, **kwargs) [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self._get(image_id) [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1185.868792] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] resp, body = self.http_client.get(url, headers=header) [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self.request(url, 'GET', **kwargs) [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self._handle_response(resp) [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise exc.from_response(resp, resp.content) [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] nova.exception.ImageNotAuthorized: Not authorized for image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a. 
[ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] During handling of the above exception, another exception occurred: [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Traceback (most recent call last): [ 1185.869348] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self._build_and_run_instance(context, instance, image, [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] with excutils.save_and_reraise_exception(): [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self.force_reraise() [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise self.value [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] with self.rt.instance_claim(context, instance, node, allocs, [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self.abort() [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1185.869855] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return f(*args, **kwargs) [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self._unset_instance_host_and_node(instance) [ 1185.870401] env[67270]: ERROR nova.compute.manager 
[instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] instance.save() [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] updates, result = self.indirection_api.object_action( [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return cctxt.call(context, 'object_action', objinst=objinst, [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1185.870401] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] result = self.transport._send( [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self._driver.send(target, ctxt, message, [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise result [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] nova.exception_Remote.InstanceNotFound_Remote: Instance 2de499d5-2eb3-4138-8c6b-41fb94ff27eb could not be found. 
[ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Traceback (most recent call last): [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return getattr(target, method)(*args, **kwargs) [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.870897] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return fn(self, *args, **kwargs) [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] old_ref, inst_ref = db.instance_update_and_get_original( [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return f(*args, **kwargs) [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] with excutils.save_and_reraise_exception() as ectxt: [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self.force_reraise() [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.871292] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise self.value [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return f(*args, 
**kwargs) [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return f(context, *args, **kwargs) [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise exception.InstanceNotFound(instance_id=uuid) [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.871674] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] nova.exception.InstanceNotFound: Instance 2de499d5-2eb3-4138-8c6b-41fb94ff27eb could not be found. [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] During handling of the above exception, another exception occurred: [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Traceback (most recent call last): [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] ret = obj(*args, **kwargs) [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] exception_handler_v20(status_code, error_body) [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise client_exc(message=error_message, [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1185.872063] 
env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Neutron server returns request_ids: ['req-6ca48b39-9169-489a-b4e6-d0f4bdfcf8e0'] [ 1185.872063] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] During handling of the above exception, another exception occurred: [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Traceback (most recent call last): [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self._deallocate_network(context, instance, requested_networks) [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self.network_api.deallocate_for_instance( [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] data = neutron.list_ports(**search_opts) [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] ret = obj(*args, **kwargs) [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1185.872468] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self.list('ports', self.ports_path, retrieve_all, [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] ret = obj(*args, **kwargs) [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] for r in self._pagination(collection, path, **params): [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] res = self.get(path, params=params) [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] ret = obj(*args, **kwargs) [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self.retry_request("GET", action, body=body, [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] ret = obj(*args, **kwargs) [ 1185.872850] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1185.873186] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] return self.do_request(method, action, body=body, [ 1185.873186] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1185.873186] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] ret = obj(*args, **kwargs) [ 1185.873186] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1185.873186] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] self._handle_fault_response(status_code, replybody, resp) [ 1185.873186] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1185.873186] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] raise exception.Unauthorized() [ 1185.873186] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] nova.exception.Unauthorized: Not authorized. 
[ 1185.873186] env[67270]: ERROR nova.compute.manager [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb]
[ 1185.889441] env[67270]: DEBUG oslo_concurrency.lockutils [None req-eb611993-5f1a-4702-9cb7-9182489dc486 tempest-ListServerFiltersTestJSON-1430927038 tempest-ListServerFiltersTestJSON-1430927038-project-member] Lock "2de499d5-2eb3-4138-8c6b-41fb94ff27eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 425.416s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1229.758616] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1233.984400] env[67270]: WARNING oslo_vmware.rw_handles [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles response.begin()
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1233.984400] env[67270]: ERROR oslo_vmware.rw_handles
[ 1233.985350] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/5e83ec70-4fab-4572-85df-50e7c4f52116/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1233.986585] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1233.986834] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Copying Virtual Disk [datastore1] vmware_temp/5e83ec70-4fab-4572-85df-50e7c4f52116/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/5e83ec70-4fab-4572-85df-50e7c4f52116/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1233.987488] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-44e5f8e6-4f5d-4cbd-8061-7e51048c773b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1233.995948] env[67270]: DEBUG oslo_vmware.api [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Waiting for the task: (returnval){
[ 1233.995948] env[67270]: value = "task-4110706"
[ 1233.995948] env[67270]: _type = "Task"
[ 1233.995948] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1234.004675] env[67270]: DEBUG oslo_vmware.api [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Task: {'id': task-4110706, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1234.506154] env[67270]: DEBUG oslo_vmware.exceptions [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Fault InvalidArgument not matched. {{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 1234.506425] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
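wait_for_task above hands CopyVirtualDisk_Task to a polling loop: oslo.vmware re-reads the task's state until it reports success or error, and on error translates the fault name into an exception class. "Fault InvalidArgument not matched" means no specific class was registered for that fault name, so the generic VimFaultException is raised, which is exactly what surfaces in the spawn failure below. A schematic version of the loop (the dict-shaped task info is invented for illustration; the real code works on suds TaskInfo objects inside an eventlet-based loopingcall):

    import time

    class VimFaultException(Exception):
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(poll_task_info, interval=0.5):
        while True:
            info = poll_task_info()  # stand-in for a TaskInfo read
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # No specific exception matched the fault name, so fall
                # back to the generic one, e.g. Faults: ['InvalidArgument'].
                raise VimFaultException(info['faults'], info['message'])
            time.sleep(interval)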
[ 1234.507093] env[67270]: ERROR nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1234.507093] env[67270]: Faults: ['InvalidArgument']
[ 1234.507093] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Traceback (most recent call last):
[ 1234.507093] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1234.507093] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] yield resources
[ 1234.507093] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1234.507093] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] self.driver.spawn(context, instance, image_meta,
[ 1234.507093] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1234.507093] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1234.507093] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1234.507093] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] self._fetch_image_if_missing(context, vi)
[ 1234.507093] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] image_cache(vi, tmp_image_ds_loc)
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] vm_util.copy_virtual_disk(
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] session._wait_for_task(vmdk_copy_task)
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] return self.wait_for_task(task_ref)
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] return evt.wait()
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] result = hub.switch()
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1234.507651] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] return self.greenlet.switch()
[ 1234.507998] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1234.507998] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] self.f(*self.args, **self.kw)
[ 1234.507998] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1234.507998] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] raise exceptions.translate_fault(task_info.error)
[ 1234.507998] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not
correct: fileType [ 1234.507998] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Faults: ['InvalidArgument'] [ 1234.507998] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] [ 1234.507998] env[67270]: INFO nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Terminating instance [ 1234.510357] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1234.510557] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1234.511407] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08529ca2-294d-46a3-944f-7fd63257cbcc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1234.518966] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1234.519212] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d80bc187-082b-4c65-9cec-f28c3b59ed87 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1234.589395] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1234.589682] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1234.589917] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Deleting the datastore file [datastore1] 65509bc1-a140-416a-a465-4c9e6efce4a0 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1234.590286] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a3653c73-9dcd-4e20-9fe8-a4f2294ca9ce {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1234.597593] env[67270]: DEBUG oslo_vmware.api [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 
tempest-ServersTestJSON-1279423063-project-member] Waiting for the task: (returnval){ [ 1234.597593] env[67270]: value = "task-4110708" [ 1234.597593] env[67270]: _type = "Task" [ 1234.597593] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1234.605813] env[67270]: DEBUG oslo_vmware.api [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Task: {'id': task-4110708, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1234.764690] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1234.764922] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1235.109567] env[67270]: DEBUG oslo_vmware.api [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Task: {'id': task-4110708, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067533} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1235.109932] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1235.109932] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1235.110118] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1235.111174] env[67270]: INFO nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Took 0.60 seconds to destroy the instance on the hypervisor. 
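The periodic task lines show oslo.service's machinery: methods decorated as periodic tasks are collected on the manager class and run by run_periodic_tasks on a timer, and _reclaim_queued_deletes exits immediately here because reclaim_instance_interval is unset. A minimal sketch (the spacing and hard-coded config value are illustrative), assuming oslo.service and oslo.config are installed:

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)
        def _reclaim_queued_deletes(self, context):
            reclaim_instance_interval = 0  # stand-in for the CONF option
            if reclaim_instance_interval <= 0:
                print('CONF.reclaim_instance_interval <= 0, skipping...')

    # A service would drive this from a timer; one call runs any due tasks.
    Manager().run_periodic_tasks(None)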
[ 1235.112574] env[67270]: DEBUG nova.compute.claims [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1235.112749] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1235.112963] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1235.232636] env[67270]: DEBUG nova.scheduler.client.report [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Refreshing inventories for resource provider ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1235.247885] env[67270]: DEBUG nova.scheduler.client.report [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Updating ProviderTree inventory for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1235.248144] env[67270]: DEBUG nova.compute.provider_tree [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Updating inventory in ProviderTree for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1235.260063] env[67270]: DEBUG nova.scheduler.client.report [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Refreshing aggregate associations for resource provider ddbaf518-603f-4953-8d5d-25c9ed7292bd, aggregates: None {{(pid=67270) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1235.276969] env[67270]: DEBUG nova.scheduler.client.report [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 
tempest-ServersTestJSON-1279423063-project-member] Refreshing trait associations for resource provider ddbaf518-603f-4953-8d5d-25c9ed7292bd, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO {{(pid=67270) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1235.303091] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8675906-89b3-4697-b7f5-5f6f9dea5d34 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1235.312188] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fbb99ea-2d70-4b68-b50d-873c6a37b658 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1235.342173] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-535ebf8e-b66d-4617-a821-c85cc87843b6 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1235.350432] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1dda4da-0e82-473f-9fdc-fb577eeaba38 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1235.364946] env[67270]: DEBUG nova.compute.provider_tree [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1235.373901] env[67270]: DEBUG nova.scheduler.client.report [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1235.388324] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.275s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1235.388869] env[67270]: ERROR nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1235.388869] env[67270]: Faults: ['InvalidArgument'] [ 1235.388869] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Traceback (most recent call last): [ 1235.388869] env[67270]: ERROR nova.compute.manager [instance: 
65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1235.388869] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] self.driver.spawn(context, instance, image_meta,
[ 1235.388869] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1235.388869] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1235.388869] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1235.388869] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] self._fetch_image_if_missing(context, vi)
[ 1235.388869] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1235.388869] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] image_cache(vi, tmp_image_ds_loc)
[ 1235.388869] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] vm_util.copy_virtual_disk(
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] session._wait_for_task(vmdk_copy_task)
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] return self.wait_for_task(task_ref)
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] return evt.wait()
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] result = hub.switch()
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] return self.greenlet.switch()
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1235.389247] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] self.f(*self.args, **self.kw)
[ 1235.389601] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1235.389601] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] raise exceptions.translate_fault(task_info.error)
[ 1235.389601] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1235.389601] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Faults: ['InvalidArgument']
[ 1235.389601] env[67270]: ERROR nova.compute.manager [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0]
[ 1235.389750] env[67270]: DEBUG nova.compute.utils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1235.391292] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Build of instance 65509bc1-a140-416a-a465-4c9e6efce4a0 was re-scheduled: A specified parameter was not correct: fileType
[ 1235.391292] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1235.391688] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1235.391866] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1235.392047] env[67270]: DEBUG nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1235.392266] env[67270]: DEBUG nova.network.neutron [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1235.622516] env[67270]: DEBUG nova.network.neutron [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1235.633830] env[67270]: INFO nova.compute.manager [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] [instance: 65509bc1-a140-416a-a465-4c9e6efce4a0] Took 0.24 seconds to deallocate network for instance. [ 1235.723655] env[67270]: INFO nova.scheduler.client.report [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Deleted allocations for instance 65509bc1-a140-416a-a465-4c9e6efce4a0 [ 1235.739090] env[67270]: DEBUG oslo_concurrency.lockutils [None req-08c649aa-fdc6-41c9-ad91-5851495c1c8f tempest-ServersTestJSON-1279423063 tempest-ServersTestJSON-1279423063-project-member] Lock "65509bc1-a140-416a-a465-4c9e6efce4a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 196.211s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1235.758058] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1236.763043] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1237.758743] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1237.759067] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1237.759259] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1237.768686] env[67270]: DEBUG 
nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1237.768963] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1237.769071] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1237.780506] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1237.780770] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1237.780986] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1237.781167] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1237.782314] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cafedfce-0f4f-4c52-90cf-208deefe6c45 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.792182] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-916e259a-093c-474b-a413-b5c850642e65 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.807020] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d3bde63-cc2c-45b7-b31d-63900cd70d83 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.814459] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16a8d3a8-c677-411f-acdb-2a18ed638534 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.845524] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180823MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1237.845704] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1237.845890] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1237.882250] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1237.882514] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1237.898408] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b7c229d-4b89-4bd1-ad07-31612b3c14d2 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.906657] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14458105-eee5-47bd-a481-370b008e63c8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.947892] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fd01ebe-d85f-4b9a-ad16-48c5198cc940 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.956911] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fdba611-e16c-4323-b998-fa229e405e0b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1237.971786] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1237.980854] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1237.993749] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated 
for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1237.993749] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1239.758449] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1239.758789] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Cleaning up deleted instances {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 1239.789057] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] There are 5 instances to clean {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 1239.791616] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 3273613a-db47-4af9-b3a5-d0dedffd3332] Instance has had 0 of 5 cleanup attempts {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1239.817541] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 87ef9733-e8d6-429e-b23f-8b8aadef784c] Instance has had 0 of 5 cleanup attempts {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1239.850257] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 49292f00-1457-438b-b5b7-2ac35dd464d2] Instance has had 0 of 5 cleanup attempts {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1239.881152] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 2de499d5-2eb3-4138-8c6b-41fb94ff27eb] Instance has had 0 of 5 cleanup attempts {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1239.908693] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 8ddc70e6-ec6f-4740-8109-6ba2c5d00536] Instance has had 0 of 5 cleanup attempts {{(pid=67270) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1240.925844] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1240.926126] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1241.003729] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Acquiring lock "13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1241.004028] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Lock "13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1241.013690] env[67270]: DEBUG nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Starting instance... {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1241.060679] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1241.060990] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1241.062622] env[67270]: INFO nova.compute.claims [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1241.132205] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd088086-bd61-4928-8449-ea48fd2d2dcc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1241.140234] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14c2374b-0a51-4d13-ba55-af8120bc4684 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1241.171206] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f8ca897-0185-43e9-9ea0-7bc3a6734e8b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1241.179570] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c2a508e-0e64-4433-a800-ec701945a77c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1241.193431] env[67270]: DEBUG nova.compute.provider_tree [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) 
update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1241.202450] env[67270]: DEBUG nova.scheduler.client.report [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1241.218600] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.156s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1241.218600] env[67270]: DEBUG nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1241.250087] env[67270]: DEBUG nova.compute.utils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1241.251681] env[67270]: DEBUG nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Allocating IP information in the background. {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1241.251862] env[67270]: DEBUG nova.network.neutron [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1241.261404] env[67270]: DEBUG nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Start building block device mappings for instance. 
{{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1241.311568] env[67270]: DEBUG nova.policy [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7ce7404137174d0a98ad8222f93a8969', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c2dc9ac8fae40f2be1eba21d8dfd863', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 1241.325893] env[67270]: DEBUG nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Start spawning the instance on the hypervisor. {{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1241.346674] env[67270]: DEBUG nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-05-14T00:53:51Z,virtual_size=,visibility=), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1241.346934] env[67270]: DEBUG nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1241.347109] env[67270]: DEBUG nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1241.347296] env[67270]: DEBUG nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1241.347442] env[67270]: DEBUG nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1241.347588] env[67270]: DEBUG 
nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1241.347796] env[67270]: DEBUG nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1241.347950] env[67270]: DEBUG nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1241.348138] env[67270]: DEBUG nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1241.348301] env[67270]: DEBUG nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1241.348507] env[67270]: DEBUG nova.virt.hardware [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1241.349377] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40758aea-b978-45da-858b-d5d2d46773ff {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1241.358233] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5454e4b-eb06-48f6-afcc-97987e2575f4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1241.597060] env[67270]: DEBUG nova.network.neutron [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Successfully created port: 2ed89a43-0ec8-4c60-a0d8-8cf672b95608 {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1241.752942] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1241.770506] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1242.094890] env[67270]: DEBUG nova.compute.manager [req-cc2875b6-aacd-4f2b-9f78-7192c0b1e01e req-718937cf-841f-4d45-b684-9df1675e2292 service nova] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Received event network-vif-plugged-2ed89a43-0ec8-4c60-a0d8-8cf672b95608 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1242.095235] env[67270]: DEBUG oslo_concurrency.lockutils [req-cc2875b6-aacd-4f2b-9f78-7192c0b1e01e req-718937cf-841f-4d45-b684-9df1675e2292 service nova] Acquiring lock "13cec5ce-b04b-4dd5-bef8-5e861ea2ff16-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1242.095356] env[67270]: DEBUG oslo_concurrency.lockutils [req-cc2875b6-aacd-4f2b-9f78-7192c0b1e01e req-718937cf-841f-4d45-b684-9df1675e2292 service nova] Lock "13cec5ce-b04b-4dd5-bef8-5e861ea2ff16-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1242.095560] env[67270]: DEBUG oslo_concurrency.lockutils [req-cc2875b6-aacd-4f2b-9f78-7192c0b1e01e req-718937cf-841f-4d45-b684-9df1675e2292 service nova] Lock "13cec5ce-b04b-4dd5-bef8-5e861ea2ff16-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1242.095817] env[67270]: DEBUG nova.compute.manager [req-cc2875b6-aacd-4f2b-9f78-7192c0b1e01e req-718937cf-841f-4d45-b684-9df1675e2292 service nova] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] No waiting events found dispatching network-vif-plugged-2ed89a43-0ec8-4c60-a0d8-8cf672b95608 {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1242.095940] env[67270]: WARNING nova.compute.manager [req-cc2875b6-aacd-4f2b-9f78-7192c0b1e01e req-718937cf-841f-4d45-b684-9df1675e2292 service nova] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Received unexpected event network-vif-plugged-2ed89a43-0ec8-4c60-a0d8-8cf672b95608 for instance with vm_state building and task_state spawning. 
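The req-cc2875b6-... entries are Nova's server-external-events machinery: Neutron calls Nova's os-server-external-events API when a port is plugged, and the compute manager tries to hand the event to a greenthread waiting on it. Because the instance is still spawning, no waiter is registered yet, so the event is logged as unexpected and dropped. A sketch of the REST call Neutron makes, with the endpoint and token as placeholders and Keystone authentication omitted:

```python
# Sketch of the notification behind "Received event network-vif-plugged-...";
# NOVA_ENDPOINT and TOKEN are placeholders, not values from this log.
import requests

NOVA_ENDPOINT = 'http://controller:8774/v2.1'  # placeholder
TOKEN = '...'                                  # placeholder keystone token

body = {'events': [{
    'name': 'network-vif-plugged',
    'server_uuid': '13cec5ce-b04b-4dd5-bef8-5e861ea2ff16',
    'tag': '2ed89a43-0ec8-4c60-a0d8-8cf672b95608',  # the Neutron port id
    'status': 'completed',
}]}
resp = requests.post(f'{NOVA_ENDPOINT}/os-server-external-events',
                     json=body, headers={'X-Auth-Token': TOKEN})
resp.raise_for_status()
```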
[ 1242.174432] env[67270]: DEBUG nova.network.neutron [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Successfully updated port: 2ed89a43-0ec8-4c60-a0d8-8cf672b95608 {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1242.183419] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Acquiring lock "refresh_cache-13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1242.183576] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Acquired lock "refresh_cache-13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1242.183729] env[67270]: DEBUG nova.network.neutron [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1242.220199] env[67270]: DEBUG nova.network.neutron [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Instance cache missing network info. 
{{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1242.388667] env[67270]: DEBUG nova.network.neutron [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Updating instance_info_cache with network_info: [{"id": "2ed89a43-0ec8-4c60-a0d8-8cf672b95608", "address": "fa:16:3e:c5:b6:84", "network": {"id": "54baea01-6ff6-4849-9b1b-0a96ac897b94", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-533148834-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9c2dc9ac8fae40f2be1eba21d8dfd863", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4055505f-97ab-400b-969c-43d99b38fd48", "external-id": "nsx-vlan-transportzone-952", "segmentation_id": 952, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ed89a43-0e", "ovs_interfaceid": "2ed89a43-0ec8-4c60-a0d8-8cf672b95608", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1242.402842] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Releasing lock "refresh_cache-13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1242.403177] env[67270]: DEBUG nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Instance network_info: |[{"id": "2ed89a43-0ec8-4c60-a0d8-8cf672b95608", "address": "fa:16:3e:c5:b6:84", "network": {"id": "54baea01-6ff6-4849-9b1b-0a96ac897b94", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-533148834-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9c2dc9ac8fae40f2be1eba21d8dfd863", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4055505f-97ab-400b-969c-43d99b38fd48", "external-id": "nsx-vlan-transportzone-952", "segmentation_id": 952, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ed89a43-0e", "ovs_interfaceid": "2ed89a43-0ec8-4c60-a0d8-8cf672b95608", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1242.404113] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c5:b6:84', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4055505f-97ab-400b-969c-43d99b38fd48', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2ed89a43-0ec8-4c60-a0d8-8cf672b95608', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1242.411094] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Creating folder: Project (9c2dc9ac8fae40f2be1eba21d8dfd863). Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1242.411605] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-10378de2-b9d7-4267-a6f5-357733575c98 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.423712] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Created folder: Project (9c2dc9ac8fae40f2be1eba21d8dfd863) in parent group-v814248. [ 1242.423923] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Creating folder: Instances. Parent ref: group-v814318. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1242.424172] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3f81c1fe-7c1f-4fc7-9177-1890e8e1ed3a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.434567] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Created folder: Instances in parent group-v814318. [ 1242.434808] env[67270]: DEBUG oslo.service.loopingcall [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1242.434994] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1242.435215] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1e38c4e9-7f99-4012-868c-226945e85dbd {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.459492] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1242.459492] env[67270]: value = "task-4110711" [ 1242.459492] env[67270]: _type = "Task" [ 1242.459492] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1242.470284] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110711, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1242.758649] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1242.758868] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Cleaning up deleted instances with incomplete migration {{(pid=67270) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 1242.970411] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110711, 'name': CreateVM_Task, 'duration_secs': 0.371312} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1242.970591] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1242.971318] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1242.971433] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1242.971767] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1242.972035] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-72d58658-2cf7-4157-822e-ab3ee34ce35a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.977194] env[67270]: DEBUG oslo_vmware.api [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Waiting for the task: (returnval){ [ 1242.977194] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52b57244-e3b3-78ae-ee16-b575f30a03d1" [ 1242.977194] env[67270]: _type = "Task" [ 1242.977194] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1242.987357] env[67270]: DEBUG oslo_vmware.api [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52b57244-e3b3-78ae-ee16-b575f30a03d1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1243.487499] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1243.487883] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1243.487979] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1243.488147] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1243.488366] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1243.488605] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7270216c-774c-4206-8207-8895a8a46573 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.496126] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1243.496292] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1243.496985] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7c654ee1-877d-41ee-8fa2-f895e47b29b6 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.501694] env[67270]: DEBUG oslo_vmware.api [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Waiting for the task: (returnval){ [ 1243.501694] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]525d7389-cd38-3927-387e-8a00a9bb78a1" [ 1243.501694] env[67270]: _type = "Task" [ 1243.501694] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1243.509062] env[67270]: DEBUG oslo_vmware.api [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]525d7389-cd38-3927-387e-8a00a9bb78a1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1244.013305] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1244.013529] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Creating directory with path [datastore1] vmware_temp/6039ab3c-1b27-4789-8ced-1f637b566c4b/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1244.013731] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5648ef20-f0f1-4b18-b470-fe4cbef1f08b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1244.034653] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Created directory with path [datastore1] vmware_temp/6039ab3c-1b27-4789-8ced-1f637b566c4b/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1244.034846] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Fetch image to [datastore1] vmware_temp/6039ab3c-1b27-4789-8ced-1f637b566c4b/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1244.035084] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Downloading image file data 
1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] vmware_temp/6039ab3c-1b27-4789-8ced-1f637b566c4b/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1244.035825] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6ee814a-50da-48e6-b841-91bd86e45e53 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1244.042866] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1556f39e-4d4a-4cba-a195-47dfbbe2d644 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1244.052124] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ba5c315-1051-4abd-b1e9-c828811c0ba0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1244.083200] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73d31878-1e97-4c49-8c41-94dfbaf920ab {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1244.089852] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2d4680a1-e771-46dd-a3a0-ebec6b79fda8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1244.110034] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1244.129550] env[67270]: DEBUG nova.compute.manager [req-d2cbcfa9-7cd4-4fc0-84aa-04afc02d3080 req-c3e80a7b-d2ff-4242-98df-ca175c94b467 service nova] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Received event network-changed-2ed89a43-0ec8-4c60-a0d8-8cf672b95608 {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1244.129742] env[67270]: DEBUG nova.compute.manager [req-d2cbcfa9-7cd4-4fc0-84aa-04afc02d3080 req-c3e80a7b-d2ff-4242-98df-ca175c94b467 service nova] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Refreshing instance network info cache due to event network-changed-2ed89a43-0ec8-4c60-a0d8-8cf672b95608. 
{{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1244.129953] env[67270]: DEBUG oslo_concurrency.lockutils [req-d2cbcfa9-7cd4-4fc0-84aa-04afc02d3080 req-c3e80a7b-d2ff-4242-98df-ca175c94b467 service nova] Acquiring lock "refresh_cache-13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1244.130107] env[67270]: DEBUG oslo_concurrency.lockutils [req-d2cbcfa9-7cd4-4fc0-84aa-04afc02d3080 req-c3e80a7b-d2ff-4242-98df-ca175c94b467 service nova] Acquired lock "refresh_cache-13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1244.130268] env[67270]: DEBUG nova.network.neutron [req-d2cbcfa9-7cd4-4fc0-84aa-04afc02d3080 req-c3e80a7b-d2ff-4242-98df-ca175c94b467 service nova] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Refreshing network info cache for port 2ed89a43-0ec8-4c60-a0d8-8cf672b95608 {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1244.160136] env[67270]: DEBUG oslo_vmware.rw_handles [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6039ab3c-1b27-4789-8ced-1f637b566c4b/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1244.218907] env[67270]: DEBUG oslo_vmware.rw_handles [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1244.219203] env[67270]: DEBUG oslo_vmware.rw_handles [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6039ab3c-1b27-4789-8ced-1f637b566c4b/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1244.419882] env[67270]: DEBUG nova.network.neutron [req-d2cbcfa9-7cd4-4fc0-84aa-04afc02d3080 req-c3e80a7b-d2ff-4242-98df-ca175c94b467 service nova] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Updated VIF entry in instance network info cache for port 2ed89a43-0ec8-4c60-a0d8-8cf672b95608. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1244.420271] env[67270]: DEBUG nova.network.neutron [req-d2cbcfa9-7cd4-4fc0-84aa-04afc02d3080 req-c3e80a7b-d2ff-4242-98df-ca175c94b467 service nova] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Updating instance_info_cache with network_info: [{"id": "2ed89a43-0ec8-4c60-a0d8-8cf672b95608", "address": "fa:16:3e:c5:b6:84", "network": {"id": "54baea01-6ff6-4849-9b1b-0a96ac897b94", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-533148834-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9c2dc9ac8fae40f2be1eba21d8dfd863", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4055505f-97ab-400b-969c-43d99b38fd48", "external-id": "nsx-vlan-transportzone-952", "segmentation_id": 952, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ed89a43-0e", "ovs_interfaceid": "2ed89a43-0ec8-4c60-a0d8-8cf672b95608", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1244.429159] env[67270]: DEBUG oslo_concurrency.lockutils [req-d2cbcfa9-7cd4-4fc0-84aa-04afc02d3080 req-c3e80a7b-d2ff-4242-98df-ca175c94b467 service nova] Releasing lock "refresh_cache-13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1276.318295] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._sync_power_states {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1276.329607] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Getting list of instances from cluster (obj){ [ 1276.329607] env[67270]: value = "domain-c8" [ 1276.329607] env[67270]: _type = "ClusterComputeResource" [ 1276.329607] env[67270]: } {{(pid=67270) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1276.330694] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da94561c-9732-4e5d-97c7-2a37d423526a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1276.340705] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Got total of 1 instances {{(pid=67270) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1276.340888] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Triggering sync for uuid 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16 {{(pid=67270) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 1276.341423] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1291.541549] env[67270]: WARNING oslo_vmware.rw_handles [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1291.541549] env[67270]: ERROR oslo_vmware.rw_handles [ 1291.542348] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/6039ab3c-1b27-4789-8ced-1f637b566c4b/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1291.543782] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1291.544037] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Copying Virtual Disk [datastore1] vmware_temp/6039ab3c-1b27-4789-8ced-1f637b566c4b/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/6039ab3c-1b27-4789-8ced-1f637b566c4b/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1291.544317] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-03cea52d-740c-40ea-833f-625c5303d375 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.554276] env[67270]: DEBUG oslo_vmware.api [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 
tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Waiting for the task: (returnval){ [ 1291.554276] env[67270]: value = "task-4110712" [ 1291.554276] env[67270]: _type = "Task" [ 1291.554276] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1291.562854] env[67270]: DEBUG oslo_vmware.api [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Task: {'id': task-4110712, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1292.065350] env[67270]: DEBUG oslo_vmware.exceptions [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Fault InvalidArgument not matched. {{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1292.065630] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1292.066283] env[67270]: ERROR nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1292.066283] env[67270]: Faults: ['InvalidArgument'] [ 1292.066283] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Traceback (most recent call last): [ 1292.066283] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1292.066283] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] yield resources [ 1292.066283] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1292.066283] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] self.driver.spawn(context, instance, image_meta, [ 1292.066283] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1292.066283] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1292.066283] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1292.066283] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] self._fetch_image_if_missing(context, vi) [ 1292.066283] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] image_cache(vi, tmp_image_ds_loc) [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] vm_util.copy_virtual_disk( [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] session._wait_for_task(vmdk_copy_task) [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] return self.wait_for_task(task_ref) [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] return evt.wait() [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] result = hub.switch() [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1292.066708] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] return self.greenlet.switch() [ 1292.067214] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1292.067214] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] self.f(*self.args, **self.kw) [ 1292.067214] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1292.067214] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] raise exceptions.translate_fault(task_info.error) [ 1292.067214] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1292.067214] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Faults: ['InvalidArgument'] [ 1292.067214] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] [ 1292.067214] env[67270]: INFO nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Terminating instance [ 1292.069277] env[67270]: DEBUG nova.compute.manager [None 
req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Start destroying the instance on the hypervisor. {{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1292.069479] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1292.070266] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06ff299c-d56a-4398-865c-3f26af6ad121 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.078053] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1292.078280] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-63bc6173-024d-453e-886f-8f59bf2ecd49 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.145782] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1292.146100] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1292.146225] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Deleting the datastore file [datastore1] 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16 {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1292.146461] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4328ce76-1b2d-43bb-9e5d-bd7403990937 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.153473] env[67270]: DEBUG oslo_vmware.api [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Waiting for the task: (returnval){ [ 1292.153473] env[67270]: value = "task-4110714" [ 1292.153473] env[67270]: _type = "Task" [ 1292.153473] env[67270]: } to complete. 
{{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1292.161681] env[67270]: DEBUG oslo_vmware.api [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Task: {'id': task-4110714, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1292.664261] env[67270]: DEBUG oslo_vmware.api [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Task: {'id': task-4110714, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070405} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1292.664665] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1292.664707] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1292.664844] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1292.665020] env[67270]: INFO nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1292.667247] env[67270]: DEBUG nova.compute.claims [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1292.667415] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1292.667630] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1292.728013] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e25be89-7db5-4303-bd65-e8f8a805de94 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.736319] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b2acfe5-bf60-48f6-8de0-3a4c44c7321b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.766939] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cb004a4-f44e-44e0-bb1e-bdba7a62d63e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.774810] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cad8810a-dac3-4bee-8b82-ee8ca9b981a9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.788157] env[67270]: DEBUG nova.compute.provider_tree [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1292.796856] env[67270]: DEBUG nova.scheduler.client.report [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1292.810077] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 
tempest-InstanceActionsTestJSON-1111197031-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.142s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1292.810625] env[67270]: ERROR nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1292.810625] env[67270]: Faults: ['InvalidArgument'] [ 1292.810625] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Traceback (most recent call last): [ 1292.810625] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1292.810625] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] self.driver.spawn(context, instance, image_meta, [ 1292.810625] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1292.810625] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1292.810625] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1292.810625] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] self._fetch_image_if_missing(context, vi) [ 1292.810625] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1292.810625] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] image_cache(vi, tmp_image_ds_loc) [ 1292.810625] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] vm_util.copy_virtual_disk( [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] session._wait_for_task(vmdk_copy_task) [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] return self.wait_for_task(task_ref) [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] return evt.wait() [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 
13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] result = hub.switch() [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] return self.greenlet.switch() [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1292.810960] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] self.f(*self.args, **self.kw) [ 1292.811273] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1292.811273] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] raise exceptions.translate_fault(task_info.error) [ 1292.811273] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1292.811273] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Faults: ['InvalidArgument'] [ 1292.811273] env[67270]: ERROR nova.compute.manager [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] [ 1292.811397] env[67270]: DEBUG nova.compute.utils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1292.812732] env[67270]: DEBUG nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Build of instance 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16 was re-scheduled: A specified parameter was not correct: fileType [ 1292.812732] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1292.813115] env[67270]: DEBUG nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1292.813309] env[67270]: DEBUG nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1292.813486] env[67270]: DEBUG nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1292.813645] env[67270]: DEBUG nova.network.neutron [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1293.045801] env[67270]: DEBUG nova.network.neutron [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1293.059676] env[67270]: INFO nova.compute.manager [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] Took 0.25 seconds to deallocate network for instance. [ 1293.144028] env[67270]: INFO nova.scheduler.client.report [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Deleted allocations for instance 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16 [ 1293.164615] env[67270]: DEBUG oslo_concurrency.lockutils [None req-e6af91a7-1e62-4425-bc0d-4141ad0d24f6 tempest-InstanceActionsTestJSON-1111197031 tempest-InstanceActionsTestJSON-1111197031-project-member] Lock "13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 52.160s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1293.164918] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 16.823s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1293.165133] env[67270]: INFO nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 13cec5ce-b04b-4dd5-bef8-5e861ea2ff16] During sync_power_state the instance has a pending task (spawning). Skip.
[ 1293.165364] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "13cec5ce-b04b-4dd5-bef8-5e861ea2ff16" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1294.758219] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1294.761832] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1296.759567] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1296.759842] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1297.759541] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1298.608761] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Acquiring lock "92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1298.609137] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Lock "92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1298.619939] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Starting instance...
{{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1298.665667] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1298.665916] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1298.667502] env[67270]: INFO nova.compute.claims [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1298.745683] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa3f4d14-c316-46f1-93c6-ea8d8262fc2b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1298.753950] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c6d3487-5ab2-42ea-96be-273c17be306a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1298.757522] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1298.757739] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1298.757871] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1298.785528] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Skipping network cache update for instance because it is Building. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1298.785686] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. 
{{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1298.786389] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b20691c0-6733-400f-b906-2cd8f052101c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1298.794870] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8f8c670-7c75-4ea2-ace6-b5dbb2406a89 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1298.809247] env[67270]: DEBUG nova.compute.provider_tree [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1298.817764] env[67270]: DEBUG nova.scheduler.client.report [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1298.831108] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1298.831591] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Start building networks asynchronously for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1298.864675] env[67270]: DEBUG nova.compute.utils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Using /dev/sd instead of None {{(pid=67270) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1298.865846] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Allocating IP information in the background. 
{{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1298.866106] env[67270]: DEBUG nova.network.neutron [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] allocate_for_instance() {{(pid=67270) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1298.875189] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Start building block device mappings for instance. {{(pid=67270) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1298.924612] env[67270]: DEBUG nova.policy [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ad9f1a9add3b4e719967b70d30371460', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b054a43e94514fdbb67f73e8a4cff197', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=67270) authorize /opt/stack/nova/nova/policy.py:203}} [ 1298.942614] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Start spawning the instance on the hypervisor. 
{{(pid=67270) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1298.966704] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-05-14T00:54:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-05-14T00:53:51Z,direct_url=<?>,disk_format='vmdk',id=1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='b4cc8d13a7354de8be4a029915d283ac',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-05-14T00:53:51Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1298.966953] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Flavor limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1298.967127] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Image limits 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1298.967376] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Flavor pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1298.967476] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Image pref 0:0:0 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1298.967597] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=67270) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1298.967806] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1298.967967] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1298.968160] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017
tempest-ServersTestJSON-1045406017-project-member] Got 1 possible topologies {{(pid=67270) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1298.968332] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1298.968568] env[67270]: DEBUG nova.virt.hardware [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=67270) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1298.969446] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-433ad05d-532b-451a-949a-fa66cdb9c413 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1298.978125] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24664739-1415-4872-95c1-59937b014503 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1299.223130] env[67270]: DEBUG nova.network.neutron [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Successfully created port: 5b49775b-a014-42a8-a55e-bfd9bb2ac56a {{(pid=67270) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1299.745498] env[67270]: DEBUG nova.compute.manager [req-05d12241-8efd-4d88-84e5-e59ab159c799 req-32c78dd2-c331-4678-84ab-374be9e1046d service nova] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Received event network-vif-plugged-5b49775b-a014-42a8-a55e-bfd9bb2ac56a {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1299.745766] env[67270]: DEBUG oslo_concurrency.lockutils [req-05d12241-8efd-4d88-84e5-e59ab159c799 req-32c78dd2-c331-4678-84ab-374be9e1046d service nova] Acquiring lock "92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1299.745939] env[67270]: DEBUG oslo_concurrency.lockutils [req-05d12241-8efd-4d88-84e5-e59ab159c799 req-32c78dd2-c331-4678-84ab-374be9e1046d service nova] Lock "92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1299.746116] env[67270]: DEBUG oslo_concurrency.lockutils [req-05d12241-8efd-4d88-84e5-e59ab159c799 req-32c78dd2-c331-4678-84ab-374be9e1046d service nova] Lock "92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1299.746301] env[67270]: DEBUG nova.compute.manager [req-05d12241-8efd-4d88-84e5-e59ab159c799 req-32c78dd2-c331-4678-84ab-374be9e1046d service nova] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] No waiting events found dispatching
network-vif-plugged-5b49775b-a014-42a8-a55e-bfd9bb2ac56a {{(pid=67270) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1299.746468] env[67270]: WARNING nova.compute.manager [req-05d12241-8efd-4d88-84e5-e59ab159c799 req-32c78dd2-c331-4678-84ab-374be9e1046d service nova] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Received unexpected event network-vif-plugged-5b49775b-a014-42a8-a55e-bfd9bb2ac56a for instance with vm_state building and task_state spawning. [ 1299.757661] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1299.767286] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1299.767484] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1299.767653] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1299.767806] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1299.768884] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfb0a78d-9816-498c-9ed5-83e55858b0c9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1299.777586] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-153d2e3d-ceac-412f-ac5f-f42123c5a826 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1299.793174] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7bd3b85-7770-4eb6-9d73-40978cc80113 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1299.801899] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22cc22d8-02e7-4cd8-bb68-5b56097f4532 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1299.834026] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180810MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1299.834390] env[67270]: 
DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1299.834726] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1299.837171] env[67270]: DEBUG nova.network.neutron [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Successfully updated port: 5b49775b-a014-42a8-a55e-bfd9bb2ac56a {{(pid=67270) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1299.846244] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Acquiring lock "refresh_cache-92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1299.846244] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Acquired lock "refresh_cache-92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1299.846476] env[67270]: DEBUG nova.network.neutron [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Building network info cache for instance {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1299.878869] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Instance 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=67270) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1299.879154] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1299.879378] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1299.882645] env[67270]: DEBUG nova.network.neutron [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Instance cache missing network info. 
{{(pid=67270) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1299.909340] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5808e857-376f-4036-a967-e88866a419b0 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1299.917563] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f56a6f79-03b5-4023-b651-e63260730e35 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1299.952944] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caf6948d-8a4e-494c-b837-79d2aa3de3ac {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1299.961464] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33b21350-a156-4c23-8f40-488c11bc6d11 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1299.975740] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1299.985322] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1300.034437] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1300.034636] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1300.080589] env[67270]: DEBUG nova.network.neutron [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Updating instance_info_cache with network_info: [{"id": "5b49775b-a014-42a8-a55e-bfd9bb2ac56a", "address": "fa:16:3e:21:f7:2d", "network": {"id": "e634920d-a39c-4043-ba9b-f5bffd4832d4", "bridge": "br-int", "label": "tempest-ServersTestJSON-1362496382-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, 
"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b054a43e94514fdbb67f73e8a4cff197", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35ac9709-fd8b-4630-897a-68ed629d1b11", "external-id": "nsx-vlan-transportzone-284", "segmentation_id": 284, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5b49775b-a0", "ovs_interfaceid": "5b49775b-a014-42a8-a55e-bfd9bb2ac56a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1300.091849] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Releasing lock "refresh_cache-92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1300.092181] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Instance network_info: |[{"id": "5b49775b-a014-42a8-a55e-bfd9bb2ac56a", "address": "fa:16:3e:21:f7:2d", "network": {"id": "e634920d-a39c-4043-ba9b-f5bffd4832d4", "bridge": "br-int", "label": "tempest-ServersTestJSON-1362496382-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b054a43e94514fdbb67f73e8a4cff197", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35ac9709-fd8b-4630-897a-68ed629d1b11", "external-id": "nsx-vlan-transportzone-284", "segmentation_id": 284, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5b49775b-a0", "ovs_interfaceid": "5b49775b-a014-42a8-a55e-bfd9bb2ac56a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=67270) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1300.092570] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:21:f7:2d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '35ac9709-fd8b-4630-897a-68ed629d1b11', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5b49775b-a014-42a8-a55e-bfd9bb2ac56a', 'vif_model': 'vmxnet3'}] {{(pid=67270) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1300.100425] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Creating folder: Project (b054a43e94514fdbb67f73e8a4cff197). 
Parent ref: group-v814248. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1300.100973] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-aeefc570-4951-47ef-ab41-51851ea6588b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1300.113025] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Created folder: Project (b054a43e94514fdbb67f73e8a4cff197) in parent group-v814248. [ 1300.113241] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Creating folder: Instances. Parent ref: group-v814321. {{(pid=67270) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1300.113493] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ccb895ce-f24e-4f4f-8d35-5894da395d2a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1300.125988] env[67270]: INFO nova.virt.vmwareapi.vm_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Created folder: Instances in parent group-v814321. [ 1300.126244] env[67270]: DEBUG oslo.service.loopingcall [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=67270) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1300.126432] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Creating VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1300.126635] env[67270]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a3ae3a3a-facb-4181-b189-298ae540966f {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1300.146572] env[67270]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1300.146572] env[67270]: value = "task-4110717" [ 1300.146572] env[67270]: _type = "Task" [ 1300.146572] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1300.156834] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110717, 'name': CreateVM_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1300.656951] env[67270]: DEBUG oslo_vmware.api [-] Task: {'id': task-4110717, 'name': CreateVM_Task, 'duration_secs': 0.294961} completed successfully. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1300.657138] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Created VM on the ESX host {{(pid=67270) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1300.663918] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1300.664109] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1300.664438] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1300.664680] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-89571140-fa0a-4182-b4c0-29b7f9e8d84e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1300.669512] env[67270]: DEBUG oslo_vmware.api [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Waiting for the task: (returnval){ [ 1300.669512] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]5207c986-47b9-d2ef-fa80-0c4046469853" [ 1300.669512] env[67270]: _type = "Task" [ 1300.669512] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1300.677484] env[67270]: DEBUG oslo_vmware.api [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]5207c986-47b9-d2ef-fa80-0c4046469853, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1301.030330] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1301.179685] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1301.179938] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Processing image 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1301.180193] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1301.180347] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Acquired lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1301.180529] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1301.180769] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ceec369b-04cd-43d5-b064-c660c11f0ab4 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1301.198884] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1301.199105] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=67270) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1301.200044] env[67270]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2da18519-f747-4cd5-85e3-272a9467ce97 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1301.206199] env[67270]: DEBUG oslo_vmware.api [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Waiting for the task: (returnval){ [ 1301.206199] env[67270]: value = "session[52a9282c-3673-4999-fafd-672c2351ecce]52c56f94-3f23-0bee-fbb5-577b56af50fb" [ 1301.206199] env[67270]: _type = "Task" [ 1301.206199] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1301.218810] env[67270]: DEBUG oslo_vmware.api [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Task: {'id': session[52a9282c-3673-4999-fafd-672c2351ecce]52c56f94-3f23-0bee-fbb5-577b56af50fb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1301.716663] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Preparing fetch location {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1301.716923] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Creating directory with path [datastore1] vmware_temp/573407f6-bda4-4eb3-a1c2-f86e7e43027a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1301.717169] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6ad6d465-24d3-4a45-9947-b06398b0ff07 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1301.737491] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Created directory with path [datastore1] vmware_temp/573407f6-bda4-4eb3-a1c2-f86e7e43027a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a {{(pid=67270) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1301.737777] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Fetch image to [datastore1] vmware_temp/573407f6-bda4-4eb3-a1c2-f86e7e43027a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1301.737958] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to [datastore1] 
vmware_temp/573407f6-bda4-4eb3-a1c2-f86e7e43027a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1301.738741] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-409c1ab7-db28-4fe1-a852-95058d3ef607 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1301.746052] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48c5d7dc-e8b1-4a59-986d-d1043840ced9 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1301.755667] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1cdcc96-bc74-4681-8a9e-0901f681010d {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1301.787684] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1a4a7d5-4321-4618-b402-218b806d9b0a {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1301.791109] env[67270]: DEBUG nova.compute.manager [req-bdadde82-a60b-4251-8e2a-e3bb1a9463e5 req-51c99bd6-da24-4f71-ab6f-a04df5a5cc92 service nova] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Received event network-changed-5b49775b-a014-42a8-a55e-bfd9bb2ac56a {{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1301.791303] env[67270]: DEBUG nova.compute.manager [req-bdadde82-a60b-4251-8e2a-e3bb1a9463e5 req-51c99bd6-da24-4f71-ab6f-a04df5a5cc92 service nova] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Refreshing instance network info cache due to event network-changed-5b49775b-a014-42a8-a55e-bfd9bb2ac56a. 
{{(pid=67270) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1301.791508] env[67270]: DEBUG oslo_concurrency.lockutils [req-bdadde82-a60b-4251-8e2a-e3bb1a9463e5 req-51c99bd6-da24-4f71-ab6f-a04df5a5cc92 service nova] Acquiring lock "refresh_cache-92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1301.791649] env[67270]: DEBUG oslo_concurrency.lockutils [req-bdadde82-a60b-4251-8e2a-e3bb1a9463e5 req-51c99bd6-da24-4f71-ab6f-a04df5a5cc92 service nova] Acquired lock "refresh_cache-92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1301.791807] env[67270]: DEBUG nova.network.neutron [req-bdadde82-a60b-4251-8e2a-e3bb1a9463e5 req-51c99bd6-da24-4f71-ab6f-a04df5a5cc92 service nova] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Refreshing network info cache for port 5b49775b-a014-42a8-a55e-bfd9bb2ac56a {{(pid=67270) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1301.797648] env[67270]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d1ff9963-715e-4c7a-8f68-dc9e19dde494 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1301.823112] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Downloading image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1301.869426] env[67270]: DEBUG oslo_vmware.rw_handles [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/573407f6-bda4-4eb3-a1c2-f86e7e43027a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1301.929135] env[67270]: DEBUG oslo_vmware.rw_handles [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Completed reading data from the image iterator. {{(pid=67270) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1301.929443] env[67270]: DEBUG oslo_vmware.rw_handles [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/573407f6-bda4-4eb3-a1c2-f86e7e43027a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=67270) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1302.109259] env[67270]: DEBUG nova.network.neutron [req-bdadde82-a60b-4251-8e2a-e3bb1a9463e5 req-51c99bd6-da24-4f71-ab6f-a04df5a5cc92 service nova] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Updated VIF entry in instance network info cache for port 5b49775b-a014-42a8-a55e-bfd9bb2ac56a. 
{{(pid=67270) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1302.109665] env[67270]: DEBUG nova.network.neutron [req-bdadde82-a60b-4251-8e2a-e3bb1a9463e5 req-51c99bd6-da24-4f71-ab6f-a04df5a5cc92 service nova] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Updating instance_info_cache with network_info: [{"id": "5b49775b-a014-42a8-a55e-bfd9bb2ac56a", "address": "fa:16:3e:21:f7:2d", "network": {"id": "e634920d-a39c-4043-ba9b-f5bffd4832d4", "bridge": "br-int", "label": "tempest-ServersTestJSON-1362496382-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b054a43e94514fdbb67f73e8a4cff197", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35ac9709-fd8b-4630-897a-68ed629d1b11", "external-id": "nsx-vlan-transportzone-284", "segmentation_id": 284, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5b49775b-a0", "ovs_interfaceid": "5b49775b-a014-42a8-a55e-bfd9bb2ac56a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1302.120520] env[67270]: DEBUG oslo_concurrency.lockutils [req-bdadde82-a60b-4251-8e2a-e3bb1a9463e5 req-51c99bd6-da24-4f71-ab6f-a04df5a5cc92 service nova] Releasing lock "refresh_cache-92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1302.758148] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1303.758571] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1349.024503] env[67270]: WARNING oslo_vmware.rw_handles [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles response.begin() [ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() 
[ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1349.024503] env[67270]: ERROR oslo_vmware.rw_handles [ 1349.025307] env[67270]: DEBUG nova.virt.vmwareapi.images [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Downloaded image file data 1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a to vmware_temp/573407f6-bda4-4eb3-a1c2-f86e7e43027a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk on the data store datastore1 {{(pid=67270) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1349.026768] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Caching image {{(pid=67270) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1349.027045] env[67270]: DEBUG nova.virt.vmwareapi.vm_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Copying Virtual Disk [datastore1] vmware_temp/573407f6-bda4-4eb3-a1c2-f86e7e43027a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/tmp-sparse.vmdk to [datastore1] vmware_temp/573407f6-bda4-4eb3-a1c2-f86e7e43027a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk {{(pid=67270) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1349.027356] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-24cd79af-640a-4052-8a66-9650408ef422 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1349.039412] env[67270]: DEBUG oslo_vmware.api [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Waiting for the task: (returnval){ [ 1349.039412] env[67270]: value = "task-4110718" [ 1349.039412] env[67270]: _type = "Task" [ 1349.039412] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1349.047568] env[67270]: DEBUG oslo_vmware.api [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Task: {'id': task-4110718, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1349.549774] env[67270]: DEBUG oslo_vmware.exceptions [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Fault InvalidArgument not matched. 
{{(pid=67270) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1349.550058] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Releasing lock "[datastore1] devstack-image-cache_base/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a/1a1a1fd7-2b6f-4b91-96e8-a30fb1d9e28a.vmdk" {{(pid=67270) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1349.550587] env[67270]: ERROR nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1349.550587] env[67270]: Faults: ['InvalidArgument'] [ 1349.550587] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Traceback (most recent call last): [ 1349.550587] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1349.550587] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] yield resources [ 1349.550587] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1349.550587] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] self.driver.spawn(context, instance, image_meta, [ 1349.550587] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1349.550587] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1349.550587] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1349.550587] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] self._fetch_image_if_missing(context, vi) [ 1349.550587] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] image_cache(vi, tmp_image_ds_loc) [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] vm_util.copy_virtual_disk( [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] session._wait_for_task(vmdk_copy_task) [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] return self.wait_for_task(task_ref) [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] return evt.wait() [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] result = hub.switch() [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1349.551130] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] return self.greenlet.switch() [ 1349.551521] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1349.551521] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] self.f(*self.args, **self.kw) [ 1349.551521] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1349.551521] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] raise exceptions.translate_fault(task_info.error) [ 1349.551521] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1349.551521] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Faults: ['InvalidArgument'] [ 1349.551521] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] [ 1349.551521] env[67270]: INFO nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Terminating instance [ 1349.554028] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Start destroying the instance on the hypervisor. 
{{(pid=67270) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1349.554232] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Destroying instance {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1349.555073] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40adbaea-d011-4ad8-ab82-2d4fbaefa4d6 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1349.562114] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Unregistering the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1349.562335] env[67270]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d12dc6fb-3059-4613-b372-e496853af419 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1349.635276] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Unregistered the VM {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1349.635559] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Deleting contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1349.635655] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Deleting the datastore file [datastore1] 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1349.635925] env[67270]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b4137bfa-710b-4b97-8f1b-052d6e00cf2c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1349.642561] env[67270]: DEBUG oslo_vmware.api [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Waiting for the task: (returnval){ [ 1349.642561] env[67270]: value = "task-4110720" [ 1349.642561] env[67270]: _type = "Task" [ 1349.642561] env[67270]: } to complete. {{(pid=67270) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1349.650455] env[67270]: DEBUG oslo_vmware.api [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Task: {'id': task-4110720, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1350.153135] env[67270]: DEBUG oslo_vmware.api [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Task: {'id': task-4110720, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078571} completed successfully. {{(pid=67270) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1350.153500] env[67270]: DEBUG nova.virt.vmwareapi.ds_util [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Deleted the datastore file {{(pid=67270) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1350.153500] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Deleted contents of the VM from datastore datastore1 {{(pid=67270) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1350.153678] env[67270]: DEBUG nova.virt.vmwareapi.vmops [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Instance destroyed {{(pid=67270) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1350.153904] env[67270]: INFO nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1350.156422] env[67270]: DEBUG nova.compute.claims [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Aborting claim: {{(pid=67270) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1350.156599] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1350.156820] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1350.221232] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e63110e-d0d6-42ed-9afe-8891939242de {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1350.229268] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57fc105b-a9ef-4129-aebd-921b694fc54e {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1350.260384] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03a1cd78-7349-4455-8f1c-e352f3f47120 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1350.268399] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5f71131-c192-493a-8345-9c68611d9ea5 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1350.284437] env[67270]: DEBUG nova.compute.provider_tree [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1350.293074] env[67270]: DEBUG nova.scheduler.client.report [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1350.306435] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.149s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1350.307043] env[67270]: ERROR nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1350.307043] env[67270]: Faults: ['InvalidArgument'] [ 1350.307043] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Traceback (most recent call last): [ 1350.307043] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1350.307043] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] self.driver.spawn(context, instance, image_meta, [ 1350.307043] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1350.307043] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1350.307043] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1350.307043] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] self._fetch_image_if_missing(context, vi) [ 1350.307043] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1350.307043] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] image_cache(vi, tmp_image_ds_loc) [ 1350.307043] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] vm_util.copy_virtual_disk( [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] session._wait_for_task(vmdk_copy_task) [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] return self.wait_for_task(task_ref) [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] return evt.wait() [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 
1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] result = hub.switch() [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] return self.greenlet.switch() [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1350.307442] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] self.f(*self.args, **self.kw) [ 1350.307757] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1350.307757] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] raise exceptions.translate_fault(task_info.error) [ 1350.307757] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1350.307757] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Faults: ['InvalidArgument'] [ 1350.307757] env[67270]: ERROR nova.compute.manager [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] [ 1350.307757] env[67270]: DEBUG nova.compute.utils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] VimFaultException {{(pid=67270) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1350.309314] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Build of instance 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff was re-scheduled: A specified parameter was not correct: fileType [ 1350.309314] env[67270]: Faults: ['InvalidArgument'] {{(pid=67270) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1350.309689] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Unplugging VIFs for instance {{(pid=67270) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1350.309863] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
[ 1350.310091] env[67270]: DEBUG nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Deallocating network for instance {{(pid=67270) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1350.310291] env[67270]: DEBUG nova.network.neutron [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] deallocate_for_instance() {{(pid=67270) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1350.577577] env[67270]: DEBUG nova.network.neutron [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Updating instance_info_cache with network_info: [] {{(pid=67270) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1350.588288] env[67270]: INFO nova.compute.manager [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] [instance: 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff] Took 0.28 seconds to deallocate network for instance.
[ 1350.671619] env[67270]: INFO nova.scheduler.client.report [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Deleted allocations for instance 92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff
[ 1350.688664] env[67270]: DEBUG oslo_concurrency.lockutils [None req-16f63814-74e7-442b-a2d0-0f0afcfe7bb1 tempest-ServersTestJSON-1045406017 tempest-ServersTestJSON-1045406017-project-member] Lock "92fdc97e-d0a1-48a1-9e3e-9ee72342b4ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 52.079s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1356.759851] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1356.760286] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1356.760286] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=67270) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1357.759612] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1358.758263] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1359.758262] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1359.758653] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Starting heal instance info cache {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1359.758653] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Rebuilding the list of instances to heal {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1359.767761] env[67270]: DEBUG nova.compute.manager [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Didn't find any instances for network info cache update. {{(pid=67270) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1360.758668] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager.update_available_resource {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1360.792442] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1360.792442] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1360.792442] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1360.792442] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=67270) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1360.792442] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd0fb7cd-cb8f-4edd-9f37-413bfc02de38 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1360.792632] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0a9d721-edfc-416b-99a0-109ccb599589 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1360.795784] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d88832f-6085-4c36-9f08-0270d78d11c8 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1360.802676] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-169ca942-7247-47f7-98e4-7aad6cc8dacc {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1360.833548] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180802MB free_disk=16GB free_vcpus=48 pci_devices=None {{(pid=67270) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1360.833717] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1360.833899] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1360.864116] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1360.864305] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=67270) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1360.878261] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75b4680e-2185-48db-ab5a-9727fbf9256b {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1360.885939] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32f89d44-8a8f-44a1-8ced-464658c648a3 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1360.916494] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4e4776a-3e4b-46f3-b928-85c70d387d2c {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1360.925274] env[67270]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25bfa595-6979-4ce3-9258-776a54449e37 {{(pid=67270) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1360.941404] env[67270]: DEBUG nova.compute.provider_tree [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed in ProviderTree for provider: ddbaf518-603f-4953-8d5d-25c9ed7292bd {{(pid=67270) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1360.950176] env[67270]: DEBUG nova.scheduler.client.report [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Inventory has not changed for provider ddbaf518-603f-4953-8d5d-25c9ed7292bd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=67270) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1360.967051] env[67270]: DEBUG nova.compute.resource_tracker [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=67270) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1360.967051] env[67270]: DEBUG oslo_concurrency.lockutils [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s {{(pid=67270) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1361.961806] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1363.753479] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1364.758695] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1365.758526] env[67270]: DEBUG oslo_service.periodic_task [None req-8edfbd7e-02ea-44ee-9e53-079cb6f83a03 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=67270) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}